Deprived of sight, blind people manage to squeeze an amazing amount of information out of their other senses. Doing this requires their brains to do some reorganising. To learn about some of these changes, scientists studied the brains of blind people who've learned to use an augmented reality system that converts images into soundscapes.
The system was invented in the early '90s, but it's not widely used. A person puts on a pair of goggles with a built-in camera, and software converts the images captured by the camera into sounds. For example, the pitch of the sound (high or low) indicates the vertical position of an object, while the timing and duration of the sound indicate the object's horizontal position and width (you can see and hear a demo of a similar technology here). For real-world scenes, the sounds are complex -- in fact, they sound a bit like a garbled transmission from an alien spacecraft.
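To make the mapping concrete, here is a minimal sketch in Python of the kind of encoding described above -- not the actual system's algorithm, just an illustration of the idea. It scans a tiny binary image and turns each horizontal run of bright pixels into a tone event, where pitch encodes vertical position, onset time encodes horizontal position, and duration encodes width. The function name and parameters are invented for this example.

```python
def image_to_soundscape(image, scan_seconds=1.0, f_low=200.0, f_high=2000.0):
    """Illustrative sketch of an image-to-sound mapping (hypothetical, not
    the real system). `image` is a list of rows (top row first), each a
    list of 0/1 pixels. Returns (onset_time, frequency_hz, duration) tuples."""
    n_rows, n_cols = len(image), len(image[0])
    col_dt = scan_seconds / n_cols  # time spent sweeping across one column
    events = []
    for r, row in enumerate(image):
        # Rows nearer the top of the image map to higher pitches.
        frac = 1.0 - r / (n_rows - 1) if n_rows > 1 else 1.0
        freq = f_low + frac * (f_high - f_low)
        c = 0
        while c < n_cols:
            if row[c]:
                start = c
                while c < n_cols and row[c]:  # find the run of bright pixels
                    c += 1
                # Onset time encodes horizontal position; duration encodes width.
                events.append((start * col_dt, freq, (c - start) * col_dt))
            else:
                c += 1
    return events

# A 3x4 image: a wide bar near the top, a single dot at the bottom left.
demo = [
    [0, 1, 1, 0],
    [0, 0, 0, 0],
    [1, 0, 0, 0],
]
print(image_to_soundscape(demo))
# The top bar becomes a long, high tone starting partway through the sweep;
# the bottom dot becomes a short, low tone at the very start.
```

A real soundscape would then synthesise these events as overlapping sine tones, but the event list alone shows why complex scenes produce such a dense, alien-sounding mix.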
But with enough practice people can learn to interpret the sounds and form a mental image of objects -- including people -- that appear in front of them.
When sighted people see an outline or silhouette of a human body, areas of the cerebral cortex that specialise in making sense of visual stimuli become active. One of these, the extrastriate body area, seems particularly interested in bodies: it responds more strongly to images of the human body than to other types of objects.
But blindness cuts off the usual flow of information from the eyes to this part of the brain, and people who've been blind since birth have never actually seen a human form. Something must change in their brains when they learn to perceive body shapes using sound. Do visual parts of the brain start responding to sounds? Or do auditory parts of the brain start responding to body shapes? It's a neat trick either way.
To find out what really happens, Ella Striem-Amit and Amir Amedi of the Hebrew University of Jerusalem scanned the brains of seven congenitally blind people who'd trained for an average of 73 hours on the augmented reality system. After training, they achieved 78 percent accuracy at classifying images into three categories: people, everyday objects (like a cellphone), and textured patterns.
In some cases, they could do even more. "During training, the participants were asked to report the body posture of the people in the images they 'saw,' and could verbally describe it quite well, and also mimic it themselves," Striem-Amit said.
Striem-Amit and Amedi also found that in blind as well as sighted people, body shapes activated an area called the temporal-parietal junction, which some researchers think is involved in figuring out the intentions of other people.
The study illustrates that the brain can be remarkably malleable, says Kalanit Grill-Spector, a neuroscientist at Stanford University. When blind people learn to read Braille, their visual cortex becomes sensitive to touch, she notes. "However, there has been little evidence for auditory stimuli driving responses in visual cortex in the blind," Grill-Spector said. "For example making human sounds such as clapping or laughing does not seem to activate visual cortex in the blind."