The next big technological pole vault in mainstream eyewear is probably the widespread adoption of something like Google Glass. But right now I want to talk about a really cool niche eyewear development that’s likely to get less buzz than Google Glass, even though it could prove much more revolutionary for the wearer: sensory substitution equipment.
Specifically, I’m talking about sensory substitution for people with visual impairment. Sensory substitution is the use of one of the body’s senses to detect data that would usually be sampled by a different sense—so, for example, colors for tastes, or tactile pressure for sound. One way of replacing vision is with what several reports have called “sonar vision” glasses, which sound like something you’d find at Wayne Enterprises R&D. But these glasses do something even cooler than anything in Batman’s utility belt: They allow blind people to recognize physical images—to “see”—through sound.
The set-up looks like this: A pair of glasses bears a small camera across the bridge. You put them on your face – no surprise there. Meanwhile, visible-spectrum data from the camera is continuously wired off to a nearby laptop or smartphone, where the crucial software is housed. This software uses an algorithm to convert patterns of light into patterns of sound. That’s the hard part, and the computer does it for you. You simply plug in your ear buds, and you’re ready to look around.
Here’s one way it can work, as demonstrated in a fantastic talk (embedded below) at TEDxJerusalem by Dr. Amir Amedi, a scientist at the Hebrew University of Jerusalem in Israel:
As the glasses’ wearer looks at an image, the camera scans from left to right, and plays tones of sound to synchronize with the horizontal sampling. This creates one axis of data: Time equals horizontal position. So if you were to use the glasses to look at this string of characters:

- -
…you would hear a single tone, a pause, and then the same tone. If the space in between the dashes were wider, the pause would be longer, and if the dashes were longer, the tones would play longer. The algorithm creates another axis of data by using pitch. If you were to look at this, from left to right:

¯ _
…you would hear a high-pitched tone, a pause, and then a lower-pitched tone. And if you were to scan this:
|    | /
…you’d hear a brief chord of many pitches at the same time, then a long pause, then the same chord, a shorter pause, and then a sudden ascending scale (the slash). From here, you can add more elements to represent more types of data. Volume can correspond to brightness (louder is brighter). Et cetera. I know it sounds complicated at first, but if you watch the video, you’ll be amazed how quickly you start to hear the pictures.
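The mapping just described can be sketched in a few lines of code. To be clear, this is an illustrative toy, not Amedi’s actual software: the function name, frequency range, and column timing below are all made-up values, and the real system would synthesize audio rather than a list of tone events.

```python
def image_to_tone_events(image, col_duration=0.05, f_low=200.0, f_high=2000.0):
    """Scan a grayscale image left to right, one column at a time.

    Each column becomes a chord: every bright pixel contributes one
    tone. Row 0 is the top of the image, so it maps to the highest
    pitch; brightness (0.0 to 1.0) controls loudness.

    Returns a list of (start_time_s, frequency_hz, amplitude) tuples.
    """
    n_rows = len(image)
    events = []
    for col in range(len(image[0])):
        t = col * col_duration              # time axis = horizontal position
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness <= 0:
                continue                    # dark pixels stay silent
            # pitch axis = vertical position (top = high, bottom = low),
            # spaced on a simple exponential (musical) scale
            frac = 1.0 - row / max(n_rows - 1, 1)
            freq = f_low * (f_high / f_low) ** frac
            events.append((t, freq, brightness))  # volume = brightness
    return events

# Two bright dashes with a gap between them: tone, pause, same tone.
dashes = [[1, 1, 0, 0, 1, 1]]
for t, f, a in image_to_tone_events(dashes):
    print(f"t={t:.2f}s  f={f:.0f}Hz  amp={a}")
```

Feeding in a two-row image with a bright pixel in the top row and one in the bottom row would likewise produce the high-tone-then-low-tone pattern from the second example above.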
Be warned: Some sources have used potentially confusing terminology in reporting on technology like this. “Sonar vision glasses” do not constitute a true system of “sonar” in the technical sense of the word. A sonar system uses sound waves to gather information about physical objects. For example, a nuclear submarine’s active sonar system pumps out a short burst of sound called a “ping,” and then listens for the ping to bounce off something and echo back. These echoes give the submarine commanders information about the position and movement of other objects in the area—like whales, or underwater rock formations, or enemy torpedoes.
“Sonar vision” glasses, on the other hand, are almost the exact opposite of sonar. Instead of turning sound into an image, they turn an image into sound.
However, you could argue—and this is the coolest part—that with the help of the glasses, the wearer turns his or her own brain into something like a sonar system. Seriously, think about it: Sound goes into your brain; image forms. After about 70 hours of practice with glasses like these, blind adults in the lab were able to, for example, judge where another person was positioned in a room, or distinguish between facial expressions. One of the most exciting findings in this area came by pairing fMRI with sensory substitution technology. With the help of sonar vision glasses, even congenitally blind adults who had never seen a single thing with their eyes in their whole lives showed activity in the visual cortex when hearing pictures through the system. In other words, even without functioning eyes, the blind person’s brain can still attempt to “see”—and as technologies like these get better and better, it may have a better and better “picture” to look at.
Amedi’s TEDx talk: