A research team from NYU Tandon School of Engineering has developed a belt equipped with vibrational and sound feedback to overcome the limitations of traditional mobility aids and improve self-navigation.
A new hope has emerged for people who are blind or have low vision. Building on the work of John-Ross Rizzo, Maurizio Porfiri, and colleagues, a team from NYU Tandon School of Engineering has recently tested an innovative navigation system that combines vibrational and sound feedback.
This one-of-a-kind wearable device was designed to help people with serious vision impairment move independently through familiar environments. Traditional mobility aids, such as white canes or guide dogs, can present significant limitations. White canes detect obstacles only through direct contact and within their reach, while guide dogs require long and expensive training.
“Traditional mobility aids have key limitations that we want to overcome,” said Fabiana Sofia Ricci, the paper’s lead author and a Ph.D. candidate in NYU Tandon’s Department of Biomedical Engineering (BME) and NYU Tandon’s Center for Urban Science + Progress (CUSP). “As a result, only 2 to 8 percent of visually impaired Americans use either aid.”
In their study, published in JMIR Rehabilitation and Assistive Technology, the research team miniaturized a backpack-based haptic feedback system into a belt equipped with 10 precision vibration motors. The other electronic components, such as a circuit board and a custom-made microcontroller, were placed inside a compact waist bag, making the device practical for everyday use.
The system’s sensory feedback is based on two different kinds of signals: the belt’s vibrations indicate the presence of a close obstacle, while audio signals, communicated through a headset, increase their frequency as users approach obstacles in their path.
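The two-channel feedback described above can be sketched in a few lines of code. This is a hypothetical illustration, not the team's actual firmware: the distance thresholds, the beep-rate ramp, and all function names are assumptions chosen to mirror the described behavior (vibration for close obstacles, audio cues that speed up on approach).

```python
# Hypothetical sketch of the belt's two feedback channels:
# vibration signals a nearby obstacle, and the audio beep rate
# rises as the obstacle gets closer. All thresholds are illustrative.

VIBRATION_RANGE_M = 1.0   # assumed distance at which the belt vibrates
AUDIO_RANGE_M = 3.0       # assumed distance at which audio cues begin

def feedback(distance_m: float) -> dict:
    """Return a feedback command for a given obstacle distance (meters)."""
    vibrate = distance_m <= VIBRATION_RANGE_M
    if distance_m <= AUDIO_RANGE_M:
        # Beep faster as the obstacle gets closer (linear ramp, 1-10 Hz).
        closeness = 1.0 - distance_m / AUDIO_RANGE_M
        beep_hz = 1.0 + 9.0 * closeness
    else:
        beep_hz = 0.0
    return {"vibrate": vibrate, "beep_hz": round(beep_hz, 2)}

print(feedback(0.5))   # close obstacle: vibration on, fast beeps
print(feedback(2.0))   # mid-range: audio cue only
print(feedback(5.0))   # clear path: no feedback
```

In a real device, the distance input would come from onboard range sensors, and the command would drive the belt's motor array and headset audio rather than returning a dictionary.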
“We want to reach a point where the technology we’re building is light, largely unseen and has all the necessary performance required for efficient and safe navigation,” said Rizzo, who is an associate professor in NYU Tandon’s BME department, associate director of NYU WIRELESS, affiliated faculty at CUSP and associate professor in the Department of Rehabilitation Medicine at NYU Grossman School of Medicine. “The goal is something you can wear with any type of clothing, so people are not bothered in any way by the technology.”
The system was tested on 72 sighted participants, who wore Meta Quest 2 VR headsets and haptic feedback belts while walking inside the Media Commons at NYU. Inside an empty room fenced off by curtains, participants experienced, through virtual reality, an underground station as people with advanced glaucoma would see it – reduced peripheral vision, blurred details, and altered color perception.
Researchers assessed how well participants navigated this virtual environment using the belt’s vibrations and audio signals while the vision impairment was being simulated.
“We worked with mobility specialists and NYU Langone ophthalmologists to design the VR simulation to accurately recreate advanced glaucoma symptoms,” said Porfiri, the paper’s senior author, CUSP Director, and an Institute Professor in NYU Tandon’s Departments of BME and Mechanical and Aerospace Engineering. “Within this environment, we included common transit challenges that visually impaired people face daily – broken elevators, construction zones, pedestrian traffic, and unexpected obstacles.”
The study’s results were encouraging: haptic feedback significantly reduced collisions with obstacles, while audio signals improved the fluidity of movement through the environment.
Further research will include participants who have completely lost their sight to better understand the potential use of these new aids by blind people.