Researchers Turn to AI to Create "Semantic Maps," Delivering Better Navigation for the Blind

Wearable system generates a simplified view highlighting only the most important information, backed by haptic and auditory feedback.

Researchers from the Johns Hopkins Applied Physics Laboratory and Whiting School of Engineering have developed a navigation system for blind and visually impaired users that taps artificial intelligence (AI) to create a semantic map of their environment — directions from which are fed back using a vibrating headband, voice prompts, and spatial sounds.

"Traditional navigation systems for the visually impaired often rely on basic sensor-based mapping, which can only distinguish between occupied and unoccupied spaces," explains lead researcher Nicolas Norena Acosta. "The new semantic mapping approach, however, provides a much richer understanding of the environment, enabling high-level human-computer interactions."

The team's portable navigation system uses a combination of depth-sensing and visible-light cameras fed into a machine learning system, which can turn the data into a "semantic map" of the user's surroundings. For the partially sighted, this is then fed back as a simplified view of the area with only the most crucial information highlighted; for the fully blind, a vibrating headband combined with spoken-word and spatial sound cues performs the same task.
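The difference between traditional occupancy mapping and semantic mapping can be sketched in a few lines of code. This is an illustrative toy only: the team's actual pipeline, class labels, and cue format have not been published, so everything below is an assumption.

```python
# Toy comparison of an occupancy grid vs. a semantic map.
# All labels and the cue format are hypothetical, not the team's actual design.

# An occupancy grid can only say "free" (0) or "occupied" (1).
occupancy = [
    [0, 0, 1],
    [0, 1, 1],
    [0, 0, 0],
]

# A semantic map instead attaches a class label to each cell,
# e.g. produced by a segmentation model run on the camera feeds.
LABELS = {0: "free", 1: "wall", 2: "door", 3: "chair"}
semantic = [
    [0, 0, 1],
    [0, 3, 1],
    [0, 0, 2],
]

def describe_cell(row, col):
    """Turn one grid cell into a spoken-word cue (hypothetical format)."""
    label = LABELS[semantic[row][col]]
    return f"{label} at row {row}, column {col}"

# Only the most crucial classes (here: doors and chairs) are surfaced
# as voice prompts; plain obstacles could go to the haptic channel instead.
cues = [describe_cell(r, c)
        for r in range(len(semantic))
        for c in range(len(semantic[r]))
        if LABELS[semantic[r][c]] in ("door", "chair")]
print(cues)  # → ['chair at row 1, column 1', 'door at row 2, column 2']
```

The point of the sketch: where an occupancy grid can only drive "obstacle ahead" feedback, per-cell labels let the system filter for what actually matters to the user and phrase it in natural language.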

"The challenge was creating a system that could synchronize and process multiple types of sensory data in real time," Norena Acosta notes. "Accurately integrating the visual, haptic and auditory feedback required sophisticated algorithms and robust computing power, as well as advanced AI techniques."

"The potential impact of this work on patient populations is substantial," adds principal investigator Seth Billings. "This could lead to greater social inclusion and participation in daily activities, ultimately enhancing the overall quality of life for blind and visually impaired individuals."

The team presented its work at the SPIE Defense + Commercial Sensing 2024 conference earlier this year; a clinical trial is currently underway, with results expected this summer.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.