From the Book: NeuroLogic by Eliezer J. Sternberg. Copyright © 2015 by Eliezer J. Sternberg, published by arrangement with Pantheon Books, an imprint of The Knopf Doubleday Publishing Group, a division of Random House LLC.
"I can figure out what things look like through other means," Amelia, who was born blind, tells me. She's describing the way she creates a mental picture of her environment.
"As I'm walking down a hallway, I can picture it. By the echoes of my heels against the ground, I know that it's a marble corridor. I can tell how long and wide it is. I sense whether the hall is congested with people or if it's empty. I'm aware of all the other footsteps. I feel the slight whoosh of air as someone passes me."
The echo of her heels against the marble changes as she enters the building's main lobby. "I can sense the grandeur of the atrium," she says. "This is clearly a fancy tower." Even in the absence of vision, Amelia can picture her surroundings by integrating her other sensations. Her brain exploits the interconnectedness of its various sensory pathways to reconstitute her vision using nonvisual means. Despite being blind, Amelia can appreciate the dimensions of a hallway, assess how crowded it is, detect the position of people around her, and even sense the elegance of the building she's in. She can navigate her environment using a nonvisual mental map.
I closed my eyes and tried to imagine what it might be like to perceive the world as Amelia does, but visual images kept popping into my mind. I wondered whether her concept of perceiving a corridor of sound was something like the way bats sense their environments using echolocation, their biological sonar that works by detecting the reflections of sounds they emit. Apparently, I'm not the only one who noticed the parallel.
Blind since he was a baby, Daniel Kish founded World Access for the Blind, an organization aiming to help people confront and overcome their blindness by developing their other senses. Kish is particularly known for his ability to use his own form of echolocation. The technique involves rapidly clicking his tongue against the roof of his mouth and listening for how the sound reflects off walls, cars, people, or anything else in his environment.
"It is the same process bats use," Kish says. "If a person is clicking and they're listening to surfaces around them they do get an instantaneous sense of the positioning of these surfaces." By carefully listening for the echo, Kish can discern even subtle differences between materials: "For example, a wooden fence is likely to have thicker structures than a metal fence and when the area is very quiet, wood tends to reflect a warmer, duller sound than metal."
Using fMRI, researchers in Canada were able to peer into the brains of people who use this human echolocation. Two blind individuals who were trained in the technique and two sighted control subjects participated in the study. All four volunteers first sat in an anechoic chamber, one specially designed so that echoes do not occur. The researchers monitored their brain activity while they tried the technique in the anechoic chamber. The purpose of this was to determine the baseline brain activation, to map the signals triggered by the activity of just hearing one's own clicks so as to eventually subtract that from the final results. In the next phase of the experiment, the control subjects were blindfolded, and along with the blind volunteers they tried echolocating outside near trees, cars, or lampposts. All the while, tiny microphones placed in their ears recorded the sounds they heard. In the final stage of the experiment, the participants entered the fMRI machine one by one, where they listened to the recordings of their own echolocation attempts.
To generate the results, the researchers subtracted out the effects of hearing participants' own clicks from each of their brain activity patterns measured by fMRI, leaving only the neurological response to the echoes. The brains of the sighted volunteers showed almost no additional activity. As expected, they were just hearing their own clicks. The results for the blind group, on the other hand, were astonishing. As they heard the recordings of their tongues clicking, the fMRI revealed activation in the visual cortex: They weren't just hearing echoes of clicking sounds. Their brains were listening to sounds and translating them into a visuospatial map of the environment.
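The subtraction contrast described above can be sketched with toy numbers. Everything here is hypothetical, a minimal illustration of the logic rather than the study's actual analysis: activity measured while hearing clicks plus echoes, minus baseline activity measured while hearing clicks alone, leaves only the response attributable to the echoes.

```python
import numpy as np

# Hypothetical "voxel" activation maps: a small 4x4x4 grid of signal
# estimates standing in for fMRI data (values invented for illustration).
rng = np.random.default_rng(0)

# Baseline condition: hearing one's own clicks with no echoes.
clicks_only = rng.normal(loc=1.0, scale=0.1, size=(4, 4, 4))

# Echo condition: identical to baseline, except one region of the grid
# (pretend it is the visual cortex) responds an extra 0.5 units to echoes.
clicks_plus_echoes = clicks_only.copy()
clicks_plus_echoes[0, :, :] += 0.5

# The subtraction contrast isolates the echo-driven response.
echo_response = clicks_plus_echoes - clicks_only

print(round(float(echo_response[0, 0, 0]), 2))  # echo-responsive region -> 0.5
print(round(float(echo_response[3, 0, 0]), 2))  # unresponsive region -> 0.0
```

In the study, the sighted controls resembled the second line (no activity beyond the clicks themselves), while the blind echolocators resembled the first: a strong residual signal, located in the visual cortex.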
Despite being unable to see, blind people don't stop using their occipital lobes. The purpose of vision is to navigate our environment. The purpose is survival. Even when visual input is cut off, the occipital lobe keeps trying to be our compass, processing spatial information through other means. The brain constructs our picture of the world by piecing together whatever information it has, even crossing sensory boundaries—and not just those of vision and hearing.
In 2013, neuroscientists in Denmark published a study looking at how the brain allows for navigation when vision is deactivated. The experiment required that the participants find their way through a virtual corridor using only their sense of touch . . . on their tongues. They used a device called the tongue display unit, which created a tactile map by stimulating the tongue whenever the user bumped into a wall of the maze. Subjects could navigate the maze using arrow keys on a computer. The challenge was to use trial and error to find their way through. Participants might first try going straight until they hit a dead end, which they could feel as a buzzing sensation, and then have to figure out which way to turn, all the while constructing a map of the maze in their minds.
The neuroscientists trained two groups of participants to use the tongue display unit: congenitally blind subjects and sighted but blindfolded controls. As neuroscientists tend to do, they watched the participants' brains with fMRI as both the blind and the blindfolded subjects maneuvered through the virtual maze.
The fMRI results looked just like those of the human echolocation study. The blind subjects, who had never perceived a photon of light in their lives, were firing on all cylinders of their visual cortices during the tongue stimulation task. Their brains had translated tactile signals into a visuospatial map. The blindfolded subjects exhibited no such activity. Their visual cortices remained quiet. In fact, when the sighted subjects stripped off their blindfolds and navigated the maze with their eyes, their brain activity matched that of the blind subjects navigating with their tongues.
Whether it's from our eyes, ears, or tongue, the brain will accept whatever sensory information it can get to construct a model of the world around us. Though members of the blind community lose the ability to see with their eyes, they can still generate a picture of the world through other means. Imagine how much more prominent these cross-sensory intersections are in the brains of blind people, who depend on sensory cross talk to substitute for their visual deficit. Their unconscious system can reprogram the visual cortex by remodeling the sensory highway system, pixelating the world around them by interweaving their other senses. They keep their sense of navigation and spatial relationships. They can enhance their use of one sense to fill the gaps of another. They retain their ability to imagine and to dream.