Deprived of sight, blind people manage to squeeze an amazing amount of information out of their other senses. Doing this requires their brains to do some reorganizing. To learn about some of these changes, scientists studied the brains of blind people who’ve learned to use an augmented reality system that converts images into soundscapes.
The system was invented in the early ’90s, but it has never been widely used. A person wears a pair of goggles with a built-in camera; software converts the images the camera captures into sounds. For example, the pitch of a sound (high or low) indicates an object’s vertical position, while the sound’s timing and duration indicate the object’s horizontal position and width (you can see and hear a demo of a similar technology here). For real-world scenes the sounds are complex; in fact, they sound a bit like a garbled transmission from an alien spacecraft.
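To make the mapping concrete, here is a minimal sketch of how such an image-to-sound scheme could work. This is an illustration based only on the description above (pitch for vertical position, timing and duration for horizontal position and width), not the actual system's algorithm; the function name, frequency range, and other parameters are invented for the example.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=8000,
                        f_min=200.0, f_max=2000.0):
    """Convert a 2-D grayscale image (rows x cols, values 0-1) into a
    mono audio signal.  The image is scanned left to right: each column
    becomes a short time slice, each row maps to a sine tone whose
    pitch rises toward the top of the image, and pixel brightness sets
    the tone's loudness.  A simplified sketch, not the real system."""
    rows, cols = image.shape
    samples_per_col = int(duration * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    # Row 0 is the top of the image, so it gets the highest frequency.
    freqs = np.linspace(f_max, f_min, rows)
    slices = []
    for c in range(cols):
        tone = np.zeros(samples_per_col)
        for r in range(rows):
            brightness = image[r, c]
            if brightness > 0:
                tone += brightness * np.sin(2 * np.pi * freqs[r] * t)
        slices.append(tone)
    signal = np.concatenate(slices)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal

# A single bright pixel in the top-left corner: the soundscape begins
# with a high tone, then falls silent for the rest of the sweep.
img = np.zeros((4, 4))
img[0, 0] = 1.0
audio = image_to_soundscape(img)
```

Even in this toy version you can see why real scenes produce such dense audio: every bright pixel in a column contributes its own tone, so a cluttered image becomes a thick stack of overlapping frequencies sweeping past the listener.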
But with enough practice people can learn to interpret the sounds and form a mental image of objects — including people — that appear in front of them.