Innovative Apps Enhance Indoor Navigation for the Visually Impaired
Two newly developed smartphone applications aim to assist blind individuals in navigating indoor environments by providing audible directions, filling a gap where traditional GPS does not work reliably.
Roberto Manduchi, a professor in the Computer Science and Engineering department at UC Santa Cruz, has dedicated his research to developing technologies that enhance accessibility for the blind and visually impaired. Over his career, he has identified a crucial demand for support in indoor navigation, particularly in unfamiliar places.
“Traveling independently through unknown interiors poses significant challenges due to the lack of visual cues, increasing the risk of becoming disoriented,” Manduchi explained. “The goal of these applications is to simplify and secure this process for users.”
In a recent study published in ACM Transactions on Accessible Computing, Manduchi’s research group introduced two innovative smartphone applications designed for indoor navigation. These apps offer not only route guidance to specific destinations but also safe return functionalities, which help users retrace their steps. The applications provide audio feedback, allowing for hands-free operation, crucial for individuals who may need their hands for a cane or guide dog.
Advancements in Accessible Technology
Smartphones serve as a practical platform for accessible technologies due to their widespread availability, affordability, and built-in sensors that foster inclusive features.
Many existing smartphone navigation systems require users to hold their devices in front of them, a task that can be impractical. Blind individuals often have at least one hand occupied with mobility aids, making it cumbersome to handle a smartphone. Additionally, prominently displaying a phone could expose users to risks, a serious concern since individuals with disabilities are often more vulnerable to crime.
Although tech giants like Apple and Google have made strides in indoor navigation at select locations—such as airports and stadiums—their systems require extensive infrastructural investments, which limits scalability and widespread adoption.
Leveraging Smartphone Sensors
Manduchi’s indoor navigation applications operate similarly to traditional GPS systems but without relying on satellite signals that are easily obstructed by buildings. Instead, these apps utilize the smartphone’s internal sensors to deliver spoken navigation guidance.
The technology uses a map of the building's interior to chart a course to a specific destination, while the phone's inertial sensors, the same sensors that power a phone's step counter, track the user's movements. These sensors also determine the orientation of the device, allowing for more precise navigation.
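The step-counting role of the inertial sensors can be illustrated with a simple peak detector on the accelerometer magnitude. This is a minimal sketch, not the apps' actual algorithm: the threshold, minimum peak spacing, and synthetic trace below are all illustrative assumptions.

```python
import math

def count_steps(accel_samples, threshold=10.8, min_gap=5):
    """Count peaks in acceleration magnitude (m/s^2) that exceed
    `threshold`, ignoring peaks closer than `min_gap` samples apart."""
    steps = 0
    last_step = -min_gap
    for i in range(1, len(accel_samples) - 1):
        a = accel_samples[i]
        # A step shows up as a local maximum above the threshold.
        if a > threshold and a >= accel_samples[i - 1] and a >= accel_samples[i + 1]:
            if i - last_step >= min_gap:
                steps += 1
                last_step = i
    return steps

# Synthetic trace: gravity baseline (~9.8 m/s^2) plus periodic step impulses.
trace = [9.8 + 1.5 * max(0.0, math.sin(i / 3.0)) for i in range(60)]
print(count_steps(trace))  # → 3
```

In a real pedestrian dead-reckoning pipeline, each detected step would be combined with the device's estimated heading to advance the position estimate.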
To enhance accuracy, the research team implemented a technique known as particle filtering, which prevents the system from erroneously suggesting impossible movements, such as walking through walls.
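The wall-rejection idea behind particle filtering can be sketched as a toy one-dimensional example: many candidate positions ("particles") are propagated with each step, and any hypothesis whose move would pass through a wall is discarded. The corridor layout, wall position, and motion model here are illustrative assumptions, not the researchers' implementation.

```python
import random

random.seed(0)

WALL_AT = 4.5  # hypothetical wall position along a corridor (meters)

def propagate(particles, step_length=0.7, noise=0.2):
    """Advance each particle by one noisy step; discard particles whose
    move crosses the wall, then resample from the survivors."""
    moved = []
    for x in particles:
        x_new = x + step_length + random.uniform(-noise, noise)
        # Walking through a wall is physically impossible, so any
        # hypothesis that implies it is dropped.
        if not (x < WALL_AT <= x_new):
            moved.append(x_new)
    if not moved:  # every hypothesis died; keep the previous set
        return particles
    # Resample back to the original population size.
    return [random.choice(moved) for _ in particles]

particles = [random.uniform(0.0, 4.0) for _ in range(200)]
for _ in range(3):
    particles = propagate(particles)
# No surviving hypothesis has crossed the wall.
print(all(x < WALL_AT for x in particles))  # → True
```

A full particle filter would also weight particles by how well they match sensor observations; this sketch keeps only the map-constraint step the article describes.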
The second application focuses on reversing previously taken routes, which can be especially beneficial when a blind person wishes to exit a space independently after being guided in. It also utilizes the smartphone’s magnetometer to detect magnetic anomalies linked to larger appliances, helping create mental landmarks.
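The magnetic-landmark idea can be sketched as flagging samples where the magnetometer magnitude deviates strongly from the Earth's background field, as it would near a large appliance. The nominal field value, threshold, and synthetic walk below are illustrative assumptions.

```python
EARTH_FIELD_UT = 50.0   # nominal background field magnitude, microtesla
THRESHOLD_UT = 15.0     # deviation treated as an anomaly (illustrative)

def find_landmarks(readings):
    """Return indices where the field magnitude deviates strongly
    from the nominal background, marking candidate landmarks."""
    return [i for i, b in enumerate(readings)
            if abs(b - EARTH_FIELD_UT) > THRESHOLD_UT]

# Synthetic walk passing a large appliance around samples 5-7.
walk = [49.0, 51.0, 50.5, 52.0, 48.0, 78.0, 82.0, 70.0, 51.0, 49.5]
print(find_landmarks(walk))  # → [5, 6, 7]
```

On the return trip, re-encountering the same anomaly in reverse order would help confirm that the user is retracing the original route.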
Enhancing Communication of Directions
Both applications convey navigational information through spoken commands and are compatible with smartwatches for added tactile feedback. The design prioritizes minimal input, enabling users to concentrate on their surroundings and safety.
Users are empowered to make their own navigation choices, with the system issuing directional prompts slightly ahead of expected turns, like saying, “at the upcoming junction, turn left,” allowing users to prepare accordingly with their mobility aids.
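The early-prompt behavior can be sketched as triggering the spoken instruction once the estimated distance to the next junction drops below an announcement radius. The radius, phrasing, and distance estimates below are assumptions for illustration only.

```python
ANNOUNCE_RADIUS_M = 3.0  # hypothetical lead distance for spoken prompts

def maybe_announce(distance_to_turn_m, direction, announced):
    """Return the prompt to speak (or None), plus the updated
    announced-state, so each turn is spoken exactly once."""
    if not announced and distance_to_turn_m <= ANNOUNCE_RADIUS_M:
        return f"At the upcoming junction, turn {direction}.", True
    return None, announced

announced = False
for d in [8.0, 5.5, 2.8, 1.0]:  # simulated distance estimates (meters)
    prompt, announced = maybe_announce(d, "left", announced)
    if prompt:
        print(prompt)  # spoken once, about 3 m before the junction
```

Speaking the instruction ahead of the turn gives the user time to position a cane or cue a guide dog before reaching the junction.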
“I believe that shared responsibility is crucial,” Manduchi emphasized. “We cannot rely solely on technology, just as a driver should remain alert when following GPS directions.”
The usability of these applications was tested in the Baskin Engineering building at UC Santa Cruz, where participants successfully navigated multiple routes. The development team is committed to refining both applications, which share a similar interface but are maintained as separate apps so that each can be improved independently.
Looking ahead, Manduchi’s team plans to incorporate artificial intelligence features enabling users to take photographs of their surroundings and receive contextual descriptions. This update could be particularly beneficial in complex navigational areas within buildings. Furthermore, the team aims to improve access to downloadable building maps, potentially through an open-source framework.
“I greatly appreciate the guidance from the blind community in Santa Cruz,” Manduchi reflected. “As engineers, creating technology for visually impaired users requires a mindset grounded in humility, focusing initially on user needs rather than technology itself.”
Source: www.sciencedaily.com