Robotics And Audio Signals Help Blind Navigate The Unknown

Aug. 16, 2006

A new system based on robotics technology and audio signals is helping the blind to “see” their surrounding environment. Georgia Tech researchers are developing a wearable computing system called the System for Wearable Audio Navigation (SWAN) designed to help the visually impaired, as well as firefighters and soldiers working in low-visibility conditions, navigate unknown territory. The SWAN system, consisting of a small laptop, a proprietary tracking chip, and bone-conduction headphones, provides audio cues to guide the person from place to place, with or without vision.

Frank Dellaert, assistant professor in the Georgia Tech College of Computing, and Bruce Walker, assistant professor in Georgia Tech’s School of Psychology and College of Computing, combined their respective areas of expertise, robot localization and audio interfaces, in a project to assist the blind. Dellaert’s artificial intelligence research focuses on tracking robots and determining their location, and on developing applications that help robots figure out where they are and where they need to go. The challenges of tracking and guiding people are similar to those of tracking and guiding robots. Dellaert’s robotics research usually focuses on military applications, since that is where most of the funding is available.

The researchers are able to localize a person outdoors with GPS data and have a working prototype that uses computer vision to see street-level details that GPS data does not capture, such as light posts and benches. The biggest challenge is integrating information from the various sensors in real time to accurately guide users as they move toward their destination.
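
The article does not describe how SWAN actually combines its sensors, but the basic idea of blending a coarse GPS fix with a vision-based refinement can be sketched with a simple weighted average. Everything below, including the function name and the weighting, is a hypothetical illustration rather than the project's algorithm.

def fuse_position(gps_xy, vision_xy, gps_weight=0.3):
    """Blend a GPS fix with a vision-based estimate (hypothetical helper).

    gps_xy, vision_xy: (x, y) positions in meters in a local frame.
    gps_weight: trust placed in GPS; the remainder goes to the vision estimate.
    """
    gx, gy = gps_xy
    vx, vy = vision_xy
    return (gps_weight * gx + (1.0 - gps_weight) * vx,
            gps_weight * gy + (1.0 - gps_weight) * vy)

# Example: GPS reports (10.0, 4.0) while vision places the user at (10.6, 3.7).
print(fuse_position((10.0, 4.0), (10.6, 3.7)))  # -> (10.42, 3.79)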

Walker’s expertise in human-computer interaction and interface design includes developing auditory displays that convey data through sonification, or sound. He explains that “by using a modular approach in building a system useful for the visually impaired, we can easily add new sensing technologies, while also making it flexible enough for firefighters and soldiers to use in low visibility situations.” The researchers need to design sound beacons that are easily understood by the user but neither annoying nor in competition with other sounds the user needs to hear, such as traffic noise.
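
One way to picture such a beacon, purely as an illustration and not the mapping SWAN actually uses, is to let the distance to the next waypoint set how often the beacon repeats and the bearing set its stereo pan; the numbers and names below are invented for the example.

def beacon_parameters(distance_m, bearing_deg):
    """Map navigation data to beacon parameters (hypothetical mapping).

    Closer targets beep faster; the bearing pans the beacon left or right.
    """
    # Repetition rate: 1 beep/s at 20 m or farther, rising to 5 beeps/s when adjacent.
    rate_hz = 1.0 + 4.0 * (20.0 - min(distance_m, 20.0)) / 20.0
    # Pan: -1.0 (hard left) to +1.0 (hard right) for bearings of -90 to +90 degrees.
    pan = max(-1.0, min(1.0, bearing_deg / 90.0))
    return rate_hz, pan

# A waypoint 5 m away, 30 degrees to the right: fast beeps, panned partly right.
print(beacon_parameters(5.0, 30.0))  # -> (4.0, 0.333...)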

The current SWAN prototype consists of a small laptop computer worn in a backpack, a tracking chip, additional sensors including GPS, a digital compass, a head tracker, four cameras, and a light sensor, plus special headphones called bone phones. The researchers selected bone phones because they send auditory signals via vibrations through the skull without plugging the user’s ears, an especially important feature for the blind, who rely heavily on their hearing. The sensors and tracking chip worn on the head send data to the SWAN applications on the laptop, which compute the user’s location and the direction the user is facing, map the travel route, then send 3D audio cues to the bone phones to guide the traveler along a path to the destination.
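
The article describes this flow from sensors to audio cue only in outline. As a rough, self-contained sketch, the step from a computed pose to a cue direction might look like the following; the function name and coordinate conventions are assumptions, not SWAN's actual code.

import math

def cue_direction(user_xy, heading_deg, waypoint_xy):
    """Relative bearing (degrees) from the user's facing direction to a waypoint.

    Positive means the waypoint is to the user's right, negative to the left.
    Hypothetical helper illustrating the pose-to-audio-cue step.
    """
    dx = waypoint_xy[0] - user_xy[0]
    dy = waypoint_xy[1] - user_xy[1]
    absolute_bearing = math.degrees(math.atan2(dx, dy))  # 0 degrees = north (+y)
    return (absolute_bearing - heading_deg + 180.0) % 360.0 - 180.0

# User at the origin facing north; waypoint 3 m east and 3 m north.
print(cue_direction((0.0, 0.0), 0.0, (3.0, 3.0)))  # -> 45.0 (cue placed to the right)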

The 3D cues sound like they are coming from about 1 meter away from the user’s body, in whichever direction the user needs to travel. Three-dimensional audio, a well-established sound effect, is created by taking advantage of humans’ natural ability to detect interaural time differences: sounds are scheduled to reach one ear slightly sooner than the other, and the brain uses that timing difference to work out where the sound originated. SWAN would supplement other techniques a blind person might already use for getting around, such as a cane to identify obstructions in the path or a guide dog.
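
As a rough illustration of the interaural-time-difference principle (not SWAN's actual rendering engine), a simple plane-wave model gives the delay between the ears as the ear spacing times the sine of the azimuth, divided by the speed of sound; the ear-spacing figure below is an assumed typical value.

import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature
EAR_SPACING = 0.21      # m, an assumed typical distance between the ears

def interaural_time_difference(azimuth_deg):
    """Approximate ITD in seconds for a source at the given azimuth.

    0 degrees = straight ahead, +90 degrees = directly to the right.
    Simplified plane-wave model: ITD = d * sin(azimuth) / c.
    """
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A beacon 45 degrees to the right reaches the right ear about 0.43 ms earlier.
print(f"{interaural_time_difference(45.0) * 1e6:.0f} microseconds")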

The researchers’ next step is to transition SWAN from outdoors-only to indoor-outdoor use. Since GPS does not work indoors, the computer vision system is being refined to bridge that gap. The research team is also revamping the SWAN applications to run on PDAs and cell phones, which will be more convenient and comfortable for users. The team plans to add an annotation feature so that users can mark useful points of interest to share with other users, such as a nearby coffee shop, the location of a puddle after recent rains, or perhaps even a park in the distance. There are plans to commercialize the SWAN technology after further refinement, testing, and miniaturization of components.
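
The article only says that annotations will be shareable; as a purely illustrative sketch, a shared annotation could be as simple as a geotagged record that serializes to a common format. The field names here are invented, not the project's schema.

from dataclasses import dataclass, asdict
import json
import time

@dataclass
class Annotation:
    """A user-added, shareable point of interest (hypothetical record layout)."""
    latitude: float
    longitude: float
    label: str
    created: float

note = Annotation(33.7757, -84.3963, "coffee shop on the corner", time.time())
print(json.dumps(asdict(note)))  # a serialized form that could be shared with other users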

SWAN Project
http://sonify.psych.gatech.edu/research/swan/index.html

