Researchers from the University of Technology Sydney and the University of Sydney, in collaboration with Sydney-based startup ARIA Research, have developed innovative smart glasses inspired by bats’ echolocation abilities. These “acoustic touch” smart glasses convert visual information into unique sound representations, aiming to improve navigation for blind and visually impaired individuals.
Assistive technology plays a crucial role in providing solutions for individuals with sensory disabilities, especially those facing challenges due to blindness or low vision. The acoustic touch technology stands out from conventional smart glasses by translating visual stimuli into distinct sound icons, creating an auditory representation of objects within the wearer’s field of view.
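The core idea — assigning each object class a distinct sound icon and placing it in the wearer's auditory field according to where the object sits in view — can be sketched roughly as follows. This is a hypothetical illustration only: the icon names, the detection format, and the panning scheme are invented for the example and are not taken from the published system.

```python
# Hypothetical sketch of the "acoustic touch" idea: each detected object
# class maps to its own short sound icon, panned left or right according
# to its horizontal position in the wearer's field of view.

# Invented icon file names; the real system's sound design is not public here.
SOUND_ICONS = {
    "cup": "icon_cup.wav",
    "book": "icon_book.wav",
    "bottle": "icon_bottle.wav",
}

def to_audio_cues(detections, fov_degrees=120):
    """Convert (label, azimuth_deg) detections into (icon, pan) cues.

    azimuth_deg is the object's horizontal angle relative to head center
    (negative = left). pan is a stereo placement value in [-1.0, 1.0].
    """
    half_fov = fov_degrees / 2
    cues = []
    for label, azimuth in detections:
        icon = SOUND_ICONS.get(label)
        if icon is None:
            continue  # no icon assigned to this class; stay silent
        # Clamp to the field of view and normalize to a stereo pan value.
        pan = max(-1.0, min(1.0, azimuth / half_fov))
        cues.append((icon, pan))
    return cues
```

For example, a cup 30 degrees to the wearer's left would produce the cup icon panned halfway left, while an unrecognized object yields no cue at all — consistent with the idea of sparse, distinct sound icons rather than continuous narration.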
In a study led by Dr. Howe Zhu from the University of Technology Sydney, the effectiveness and usability of the acoustic touch technology were investigated. Fourteen participants, including individuals with and without visual impairments, were involved in the study. The results showed that the wearable device equipped with acoustic touch technology significantly improved blind or low-vision individuals’ ability to recognize and reach for objects without excessive mental effort.
Dr. Zhu suggests that this advancement has significant potential as a wearable and efficient method of sensory enhancement for the visually impaired community, enabling users to accurately identify and reach for objects.
The research sheds light on the critical role of assistive technology in addressing the everyday challenges faced by individuals with blindness or low vision, particularly in locating specific household items and personal belongings. By overcoming these hurdles, the acoustic touch technology opens up new possibilities for greater independence and improved quality of life for this demographic.
Distinguished Professor Chin-Teng Lin from the University of Technology Sydney, a global leader in brain-computer interface research, explains that traditional smart glasses rely on computer vision and other sensory information to convert the wearer's surroundings into synthesized speech. The auditory feedback provided by acoustic touch, he says, instead empowers users to identify and reach for objects with remarkable accuracy.
The study's findings were published in the journal PLOS ONE, further highlighting the potential impact of this groundbreaking technology. As assistive technology continues to evolve, it holds the promise of transforming the daily lives of individuals with sensory disabilities.
© 2023 TECHTIMES.com. All rights reserved. Reproduction without permission is prohibited.
