Despite the old adage, bats aren’t actually blind. They are, however, nocturnal, which means they must hunt in the dark. They can swoop at high speeds, around obstacles, and even through holes in netting—whatever it takes to nab a tasty fly in midair. Their precise hunting skills stem from their ability to echolocate. Bats navigate the skies by making a chirping sound and then listening to the echoes that bounce off of their surroundings. By measuring the delay between a chirp and its echoes, and comparing the delays across their two ears, bats can infer where the echoes came from. The chirps are way up in the ultrasonic range, too high-pitched for human ears to hear. However, the short wavelength of these high-frequency sound waves is key, allowing the echoes to return with high-resolution information about the shape of the bat’s environment.
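The arithmetic at the heart of this trick is simple enough to sketch. Here is a minimal Python example of the delay-to-distance computation, assuming a nominal speed of sound of about 343 meters per second in air:

```python
# Minimal sketch: converting an echo's round-trip delay into distance.
SPEED_OF_SOUND = 343.0  # m/s in dry air at about 20 degrees C

def distance_to_target(echo_delay_s: float) -> float:
    """The chirp travels out to the target and back, so the one-way
    distance is half of delay times speed."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving 10 ms after the chirp implies a target ~1.7 m away.
print(distance_to_target(0.010))  # -> 1.715
```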
The human auditory system comes equipped with some of these abilities. For instance, you can easily tell whether a person calling your name is doing so from the left or the right. But humans are primarily visual creatures—roughly a third of the human brain is devoted to the processing of visual information. Most humans rely on visual, rather than auditory, processing capabilities to gain information about the spatial layout of the world around them.
Blind people, on the other hand, must rely on their other senses. Typically this means using hearing and touch; however, some blind people have picked up a new sense to help them navigate their world. Borrowing a page from the bat playbook, they have learned to make clicks with their tongues or fingers and to extract information from the subtle echoes that bounce back. Constant clicking provides a continuous flow of information, granting these human echolocators greater mobility and independence. This technique has its stunning success stories, although the reality is that these are the exception, not the rule.
A team of scientists from UC Berkeley believes it can improve human echolocation with an invention it calls the Sonic Eye. The device emits ultrasonic chirps much like those used by bats, then slows down the echoes so that the user can hear and interpret them. If the current prototype can be made practical for everyday use, it could greatly improve the lives of many of the 40 million blind people living in the world today. It could also act as a tool in research, providing scientists with a completely new way of rewiring the brain’s sensory processing circuits. Surprisingly, it is a side project, the result of a series of speculative conversations and chance encounters between a handful of current and former UC Berkeley PhD students and postdocs.
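The basic frequency-lowering trick can be sketched in a few lines of Python. The example below is a toy illustration, not the team’s implementation; the sample rates, the 20x slowdown factor, and the synthetic 60 kHz tone are all assumptions chosen for illustration. Replaying samples recorded at a high rate at a lower rate divides every frequency, and stretches every echo delay, by the same factor:

```python
import numpy as np

# Toy illustration (not the Sonic Eye's actual code): audio recorded at
# a high sample rate, replayed at a lower rate, has every frequency
# divided by the slowdown factor.
RECORD_RATE = 192_000                    # Hz, assumed microphone rate
SLOWDOWN = 20                            # assumed slowdown factor
PLAYBACK_RATE = RECORD_RATE // SLOWDOWN  # 9,600 Hz

# Stand-in for a real echo: a 5 ms ultrasonic tone at 60 kHz.
t = np.arange(0, 0.005, 1.0 / RECORD_RATE)
echo = np.sin(2 * np.pi * 60_000 * t)

# "Slowing down" is just reinterpreting the same samples at the lower
# rate: the 60 kHz tone is heard at 3 kHz, and its 5 ms duration (like
# any echo delay) stretches by the same factor, here to 100 ms.
heard_pitch = 60_000 / SLOWDOWN
heard_duration_ms = 1000 * len(echo) / PLAYBACK_RATE
print(f"{heard_pitch:.0f} Hz for {heard_duration_ms:.0f} ms")
```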
Jascha Sohl-Dickstein, Nicol Harper, Santani Teng, and Benji Gaub drew inspiration from both bats and human echolocators, combining the best of both worlds to create the Sonic Eye. In the process, they have provided a case study in cross-disciplinary collaboration. Their journey has seen them pushing through the rigors of academic research one day and tap dancing through the fast-paced world of business the next. As funding for scientific research becomes ever scarcer, researchers have begun to stray from the well-trodden paths of traditional academic research. They are getting creative, seeking resources and support in increasingly diverse arenas. The Sonic Eye is not your typical research project, but this team’s twists and turns through academia and industry may become more common for research projects in the future.
Jascha Sohl-Dickstein and Nicol Harper, both former researchers at the Redwood Center for Theoretical Neuroscience at UC Berkeley, are the Sonic Eye’s proud parents. After attending a talk on how the brain detects statistical patterns in sounds, they began to speculate about the neural mechanisms behind human echolocation.
How do human echolocators acquire their skill, and why are some better at it than others? Some have learned from Daniel Kish, perhaps the best-known human echolocator as well as an avid hiker and mountain biker. His organization, World Access for the Blind, works to give blind individuals a higher quality of life by teaching them to echolocate. Nevertheless, his methods remain controversial among advocacy groups for blind people. Many recommend the tried-and-true white cane over echolocation to avoid sowing false hopes. Some have more success with echolocation than others, and nobody is quite sure why. Do some fear the social stigma of constant clicking and give up? Do some produce more precise clicks with their tongue or spend more time outside practicing? Or are some just more bat-brained than others?
Harper, who studies how the brain processes sound, scoured the bat brain literature to size up the problem. He turned up only a few small ways in which the auditory system of the bat—or of the dolphin, another echolocating mammal—differed from that of humans. Bats have selective hearing, but so do humans. Bats have special delay-detecting neurons with superfast conduction, but speed alone was an incomplete explanation. The neuroanatomy appeared surprisingly similar. It seemed reasonable, then, that bats’ specialized chirps and satellite dish-shaped outer ears, or pinnae, were the primary source of their powers. The flip side of this idea was the tantalizing suggestion that humans might be able to reach batlike levels of echolocating prowess if they were outfitted with the right equipment.
Most bat species are excellent echolocators. Their large external ears, or pinnae, aid in their acoustic behavior. Credit: Flickr user furryscaly
Harper and Sohl-Dickstein’s next step was to try to bestow these same advantages upon humans. They borrowed some equipment from UC Berkeley Professor of Physics Mike DeWeese, whose research focuses on auditory perception. After rejecting a few initial designs, they mounted headphones, a microphone, and a set of speakers on a child’s BMX-style bike helmet. They stuffed an old computer running signal processing algorithms into an infant carrier backpack. Finally, they tacked on a pair of handmade clay pinnae, designed to mimic bat ears, and the Sonic Eye was born.
The Sonic Eye belongs to a class of technologies known as sensory substitution devices (SSDs). The majority of SSDs help blind people by taking in visual information and translating it into auditory or tactile information. Some users are remarkably adept at using this information—one blind SSD user famously climbed Mt. Everest by “seeing” through electrical stimulation on the tongue.
However, unlike most SSDs—which heavily process the incoming signal—the Sonic Eye leaves echoes in a relatively raw form. Instead of trying to replace vision with feats of engineering, the Sonic Eye provides richly informative sensory input and relies on the incredible adaptability of the human brain to do the rest. This approach is informed in part by Sohl-Dickstein’s computer models of how our brains learn to represent the world. “The majority of the brain’s representations are learned from sensory inputs, rather than being hard-wired in during fetal development,” says Sohl-Dickstein. “The brain learns to interpret the statistical patterns it finds in these inputs.”
The brain’s ability to reorganize itself in this way is called neuroplasticity, and it plays a more central role in sensory processing than most people think. We tend to take our senses for granted, assuming that we don’t learn to see or hear the same way we learn to read, write, or ride a bicycle. But even our most basic sensory capabilities require input to wire up properly. The senses are so dependent on experience that it seems like acquiring a new sense could be as simple as feeding the brain the right signal.
When used in conjunction with neuroimaging methods, SSDs like the Sonic Eye provide a unique opportunity to empirically test the limits of neuroplasticity. Neuroimaging experiments have already demonstrated that the visual cortex of blind people is active when reading Braille. Their “unemployed” visual cortex is also recruited for other nonvisual and even cognitive tasks, revealing that its loyalty to vision is highly negotiable. Its predilection for extracting certain types of information, on the other hand, is comparatively set in stone. Certain aspects of brain organization, such as which “visual” areas process features like shape, motion, or even learned percepts like letters or body parts, show up in the brains of blind SSD users. Tactile or auditory stimulation provided by SSDs can convey the same spatial information in a different format, and the brain is capable of picking up and running with it.
Benji Gaub demonstrates the Sonic Eye prototype. Credit: Sarah Hillenbrand
When Santani Teng saw a demonstration of the Sonic Eye in use at a Redwood Center seminar, he saw a promising technique for spurring neuroplasticity. He joined the team, eager for a chance to literally mold minds and then test their newfound abilities in a laboratory setting. As part of his PhD research in psychology at UC Berkeley, Teng had already studied human echolocators. “I was in a vision lab, and didn’t know anything about the auditory system when I started,” he says. “But it seemed like echolocation wasn’t just a special case of hearing, because object detection and localization are usually done using vision.”
Functional magnetic resonance imaging (fMRI) has recently been used to demonstrate that the visual cortex of blind people is used for echolocation, just as it is for Braille. However, the details of how this rewiring occurs remain a mystery. One clue is that a map of space in the visual cortex, like the one produced by visual inputs in the sighted brain, was only found in an echolocator whose vision was lost earlier in life. This could mean that older brains, whose visual systems have already wired up, are like the proverbial old dog who cannot learn new tricks.
Using the Sonic Eye, Teng hopes he can precisely control participants’ echolocating practice time instead of seeking out the rare proficient human echolocators for experiments. Eventually, he would like to use fMRI to find out which properties of the visual system emerge in the blind brain when it is outfitted with the power of high-resolution echolocation. This could help illuminate the constraints on neuroplasticity that determine who can and cannot learn to echolocate. For now, when novice echolocators ask Teng whether he thinks they can become experts, he is cautious. “I’m sort of the Eeyore of the group,” Teng says, “so I tell them that whatever the perceptual benefits are, practicing echolocation forces them to practice independence.”
In a first test for the Sonic Eye, blindfolded sighted people listened to pre-recorded echoes that had bounced off of a plastic plate located somewhere in front of them. Without any feedback about their performance, they quickly distinguished left from right using their normal sound localization ability. With a bit of feedback, they learned to distinguish near from far. The echoes take longer to return from faraway objects, and this delay is recognizable when the Sonic Eye stretches the echoes in time. Sohl-Dickstein and the team are in the process of publishing the results of these preliminary sound localization experiments as a proof of concept.
Not all three dimensions were perceived equally well, however; elevation proved troublesome. Blindfolded volunteers had a hard time discriminating high from low in the 20 trials they completed, but Harper got the hang of it after about six hours of self-guided practice. These judgments rely on the pinnae, which filter high and low frequencies differently depending on where a sound originates. For example, try snapping your fingers behind your head and then in front of your face: the snap sounds higher, sharper, and clearer in front.
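A toy simulation conveys the idea. The Python sketch below is a crude stand-in for real pinna filtering, not a measured head-related transfer function; the 4 kHz low-pass cutoff for sounds from “behind” is an assumption chosen purely for illustration:

```python
import numpy as np
from scipy.signal import butter, lfilter

rate = 44_100
rng = np.random.default_rng(0)
snap = rng.standard_normal(rate // 100)  # 10 ms noise burst as a "snap"

# Crude stand-in for pinna filtering: sounds from behind lose
# high-frequency energy (assumed 4 kHz cutoff; real pinna filtering is
# far more intricate and direction-specific).
b, a = butter(4, 4000, btype="low", fs=rate)
snap_front = snap
snap_behind = lfilter(b, a, snap)

# Compare high-frequency content: this spectral difference is the cue.
for name, sound in [("front", snap_front), ("behind", snap_behind)]:
    spectrum = np.abs(np.fft.rfft(sound))
    freqs = np.fft.rfftfreq(len(sound), 1.0 / rate)
    ratio = spectrum[freqs > 6000].mean() / spectrum.mean()
    print(f"{name}: relative energy above 6 kHz = {ratio:.2f}")
```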
Credit: Natalia Bilenko; helmet: Liikenneturva
The prototype’s clay pinnae may require a redesign, an acclimation period, or both. The results of one eyebrow-raising study set a precedent for Harper’s acclimation period. Volunteers wore elf-like artificial pinnae that initially disrupted their sound localization; after about six weeks, they had adapted and could localize sounds normally again. Despite this substantial adjustment period, their hearing returned to normal as soon as the artificial pinnae were removed. The absence of an after-effect implies that the underlying plasticity is switchable, more like learning a new language than undergoing some irreversible sensory conversion. This is good news for would-be Sonic Eye users who may want to stow their bat ears when not echolocating.
Next, Teng plans to find out whether the Sonic Eye can help people identify the shapes of objects. With engineer Alex Maki-Jokela, he and the team have been fine-tuning the device’s software to improve the user’s experience. The first formal experiments with blind test subjects, however, may still be far in the future. Academic research can be slow, particularly in collaborative projects like this one. Given the practical value of the device, the team wanted to find a way to get it into the hands of blind users quickly, and on a much bigger scale. They inevitably found their way to academia’s fast-talking cousin: industry.
Credit: BrainPort: Wicab, Inc.; VibroGlove: Sethuraman Panchanathan at Arizona State University's Center for Cognitive Ubiquitous Computing; vOICe: Peter Meijer (www.seeingwithsound.com); EyeMusic: Amir Amedi's lab
The first step toward commercializing the Sonic Eye was to identify its niche in the SSD market. The information provided by the Sonic Eye is intuitively accessible, based on techniques originally developed by blind people. Other echolocation devices exist, but they tend to simplify the returning echoes. “Some devices try to do all the engineering for the brain,” says Sohl-Dickstein. “They throw away almost all of the information, and users are left with just an obstacle warning system.” While this simplification may make these devices easier to use, it also makes them less useful.
The Sonic Eye bets on neuroplasticity, preserving the rich detail of the unprocessed signal for possible use. How much of this richness can be used is an open question, but the team’s bets are hedged—humans’ natural sound localization ability lays a solid foundation for learning the basics of Sonic Eye use. We are practiced at extracting spatial information from sounds’ arrival times. In contrast, an SSD called the vOICe uses a translation that is ingenious but unnatural. It scans the visual scene with a camera, translating brightness, vertical location, and horizontal location into the loudness, pitch, and time of a soundscape. These soundscapes are very useful for identifying simple shapes and letters, but become overwhelming in visually cluttered environments.
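To make the mapping concrete, here is a minimal sketch of a vOICe-style translation in Python; the scan duration, frequency range, and normalization are illustrative assumptions, not the vOICe’s actual parameters:

```python
import numpy as np

def image_to_soundscape(image: np.ndarray, rate: int = 22_050,
                        scan_seconds: float = 1.0) -> np.ndarray:
    """Sketch of a vOICe-style mapping. image: 2D array of brightness
    in [0, 1], row 0 at the top. The image is scanned left to right
    over time; each pixel's row sets the pitch of a sine component and
    its brightness sets that component's loudness."""
    n_rows, n_cols = image.shape
    # Top rows map to high pitches, bottom rows to low pitches (assumed
    # 5 kHz to 500 Hz range).
    freqs = np.geomspace(5000, 500, n_rows)
    col_len = int(rate * scan_seconds / n_cols)
    t = np.arange(col_len) / rate
    columns = []
    for c in range(n_cols):
        tones = image[:, c, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        columns.append(tones.sum(axis=0))
    sound = np.concatenate(columns)
    return sound / max(np.abs(sound).max(), 1e-9)  # normalize to [-1, 1]

# A bright diagonal line in the image becomes a descending pitch sweep.
soundscape = image_to_soundscape(np.eye(16))
```

Played back, that diagonal line is heard as a falling sweep across the one-second scan: a compact but unnatural code that the brain must learn.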
In the day-to-day life of a blind person, the primary use of the Sonic Eye would be spatial navigation, while other devices like the vOICe could be used for tasks like reading. Given the ubiquity of smartphones, it seems likely that the all-in-one SSD of the future would integrate these capabilities as mobile apps. Like all SSDs, the Sonic Eye can never be a true replacement for vision. It’s not clear what the experience even is: a form of seeing, a form of hearing, or a unique neuroplastic fusion of the two. But to commercialize their creation, the team needed to prove it could be useful.
In 2012, the team gained two new members: Benji Gaub and Jeff Hawkins. Gaub is a PhD candidate in the Helen Wills Neuroscience Institute, developing gene therapy techniques for sight restoration. Hawkins is, among other things, the inventor of the Palm Pilot and one of the most prominent figures in computational neuroscience today. Their collaboration began over a discussion of their mutual interests in the visual system and the rewiring that occurs in blindness. It was Hawkins who sparked Gaub’s interest in SSDs. At the suggestion of a labmate, Gaub contacted Sohl-Dickstein and joined the Sonic Eye team. His research made him an asset to the team, but it was his business savvy, acquired through UC Berkeley’s Management of Technology certificate program, that pushed him into the role of team businessman.
Gaub returned to Hawkins, explaining the advantages of the device. Hawkins saw an opportunity to resurrect his interest in developing an SSD, and provided a goodwill donation of $5,000 to the project. He also signed on to mentor the team as they prepared a proposal for the 2013 Global Social Venture Competition (GSVC), which challenges aspiring entrepreneurs to turn their ideas into actual businesses with a positive impact on the world. The team threw themselves into writing up a business plan for the competition.
Gaub researched the SSD market, calculated the social return on investment in dozens of different ways, and contacted members of advocacy groups for blind people like LightHouse for the Blind and Visually Impaired in San Francisco, the Lions Center for the Blind in Oakland, and the East Bay Center for the Blind. Hawkins organized a brainstorming meeting at IDEO, a design and innovation consulting firm famed for developing Apple’s first mouse. The team sketched out the infrastructure needed to make the device usable: training materials, demonstrations, and technical support. Summarizing the Sonic Eye’s appeal for an audience of potential investors took them worlds away from the familiar arena of scientific grant writing, but Hawkins helped them find the hook: it is the 21st century and the best assistive technology available to blind people is a stick. We can do better.
The business plan required them to quantify the Sonic Eye’s potential utility in comparison to current technologies. For example, guide dogs cost around $42,000 to train, can come with a lengthy wait, and are expensive to feed and care for. Based on the average income and health insurance situation of potential users, the Sonic Eye could be expected to retail for around $4,000. Meanwhile, the cost of lost productivity due to blindness is an estimated four billion dollars annually in the United States alone. The Sonic Eye also eliminates a common barrier to echolocation: perceived social stigma. People don’t typically want to walk around clicking, as they feel it gets in the way of social interaction. The device’s inaudible ultrasonic chirps, however, would allow people to echolocate without drawing attention, opening up possibilities for jobs and social interactions.
In the end, the Sonic Eye reached the GSVC semifinals, and the team gained a wealth of new connections to pave the way toward commercialization. Despite these developments, the team is currently seeking a new addition: an electrical engineer to help develop a miniaturized prototype. The current bike helmet and backpack combination may be fetching, but shrinking the setup down to a headband or baseball cap would make it much easier for test users to wear the device as they go about their lives.
“One common pitfall for SSDs is to claim that they will restore sight,” says Gaub. “But many blind people, if given the opportunity, would not have their sight restored. They’ve learned to navigate their world one way, and would rather not re-learn from the ground up.” Joshua Miele, a scientist at the Smith-Kettlewell Eye Research Institute, articulates this problem well. Miele writes, “many sighted inventors of accessibility tools unconsciously substitute a combination of mythology, prejudice, and pity for market research. They come up with an idea that they think would be useful for blind people and leap into development mode, often without consulting even one informed blind consumer.” The Sonic Eye latches onto and extends a solution pioneered by blind individuals, taking user experience as the cornerstone of what works and what should be built upon.
The team’s forthcoming publication and their GSVC business plan each take the device’s pedigree up a notch. The publication documents the Sonic Eye’s three-dimensional sound localization sensitivity in precise detail, while the business plan gives would-be investors a detailed accounting of the risks and rewards in units of dollars, blind users, and whatever else they could think to ask about. This may allow the team to more readily seize any opportunities that knock on their door in the future. “Too often, ideas aren’t valued, or they don’t get implemented because it’s too hard,” says Gaub. “We want measurable successes. The question is, what path should we take to get there?”
On the academic side, rigorous experimental tests are needed, but allocating time to such a speculative, homegrown project is difficult. The current job market in academia favors publications that are the result of big ideas and even bigger grants. Furthermore, the team has been scattered to the winds. Gaub will soon graduate from UC Berkeley. Sohl-Dickstein has moved on to postdoctoral work at Stanford, Teng to MIT, and Harper to Oxford.
Increasingly, however, new technologies invented in university laboratories are finding their way to commercial markets. Engineering faculty in the Bay Area seem to have a startup for every gadget they’ve ever invented. The line between academia and industry has never been blurrier, and commercializing the Sonic Eye would mean faster access to blind test users. The Sonic Eye straddles this line, making progress along both tracks in fits and starts.
However, this project is ultimately a labor of love. The Sonic Eye team’s willingness to embrace the project’s multifaceted nature has led them down a third avenue: outreach. The Sonic Eye has been making the rounds in Berkeley classrooms and at festivals. Several semesters of the student-led course “Altered States and the Brain” have explored echolocation thanks to the Sonic Eye. The team brought the Sonic Eye to Maker Faire Bay Area, a festival billed as “the greatest show and tell on Earth.” The event showcases fantastic feats of engineering and whimsy, attracting an audience of curious minds. “People seem confused when they first put it on, but it’s cool to see how it makes more sense the more you listen,” says Sohl-Dickstein. “Sounds start to take on an identity.”
Philosophers have considered the title question of Thomas Nagel’s 1974 essay “What is it like to be a bat?” for four decades. Nagel concludes that it is ultimately impossible to know, so isolating is the nature of our subjectivity. In a footnote, however, he mentions human echolocators. If we knew what it was like to echolocate, he allows, we would at least be able to use our imaginations productively. When the Sonic Eye team straps their high-fidelity ultrasonic call-and-response system to a person with a healthy curiosity, the user peeks into a world of previously alien perceptions. This new experience satisfies a hunger to know how we learn about our world, what is innate, and what can be changed.
Whether the device answers any enduring mysteries of the mind remains to be seen, but in the meantime, the team receives something else that is hard to come by in academia: instant gratification. Gaub’s research on sight restoration is exciting, but is a much longer game. “The implementation of these therapies may be decades and millions of dollars away,” he says, “but I can put a smile on someone’s face today with the Sonic Eye.”
Featured image credit: design, Natalia Bilenko; original photo, Sarah Hillenbrand.
This article is part of the Fall 2014 issue.