Ten healthy subjects wearing blindfolds were given solely auditory stimulation, in the absence of any visual stimulation; retinotopic mapping was performed in a separate session.
University of Glasgow scientists studying the brain processes involved in sight have discovered that the visual cortex also uses information gleaned from the ears when viewing the world.
They suggest this auditory input enables the visual system to predict incoming information and could confer a survival advantage.
“Sounds create visual imagery, mental images, and automatic projections,” said Professor Lars Muckli, of the University of Glasgow’s Institute of Neuroscience and Psychology, who led the research. “For example, if you are in a street and you hear the sound of an approaching motorbike, you expect to see a motorbike coming around the corner.”
The study, published in the journal Current Biology (open access), involved conducting five different experiments using functional Magnetic Resonance Imaging (fMRI) to examine the activity in the early visual cortex in 10 volunteer subjects.
In one experiment they asked the blindfolded volunteers to listen to three different sounds: birdsong, traffic noise and a talking crowd. Using an algorithm that can identify unique patterns in brain activity, the researchers were able to discriminate between the three sounds based on activity in the early visual cortex alone.
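The study's decoding analysis belongs to a family of multivariate pattern-classification techniques. As a toy illustration of the general idea (not the study's actual algorithm, and with made-up "voxel activity" numbers), a classifier can learn an average activity pattern for each sound category and assign a new pattern to the nearest one:

```python
# Toy multivariate pattern classification: nearest-centroid decoding.
# All data below are invented for illustration.

def centroid(patterns):
    """Element-wise mean of equal-length activity vectors."""
    n = len(patterns)
    return [sum(vals) / n for vals in zip(*patterns)]

def classify(pattern, centroids):
    """Return the label whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(pattern, centroids[label]))

# Training patterns: simulated voxel responses per sound category.
training = {
    "birdsong": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "traffic":  [[0.1, 0.9, 0.3], [0.2, 0.8, 0.2]],
    "crowd":    [[0.2, 0.3, 0.9], [0.1, 0.2, 0.8]],
}
centroids = {label: centroid(p) for label, p in training.items()}

# A new, unlabeled activity pattern is decoded from the learned centroids.
print(classify([0.85, 0.15, 0.2], centroids))  # birdsong
```

In the actual study the "pattern" would be the fMRI response across many voxels of the early visual cortex, and decoding success above chance is what shows the sounds leave a discriminable trace there.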
A second experiment revealed that even imagined images, in the absence of both sight and sound, evoked activity in the early visual cortex.
“This research enhances our basic understanding of how interconnected different regions of the brain are,” Muckli said. “The early visual cortex hasn’t previously been known to process auditory information, and while there is some anatomical evidence of interconnectedness in monkeys, our study is the first to clearly show a relationship in humans.
“This might provide insights into mental health conditions such as schizophrenia or autism and help us understand how sensory perceptions differ in these individuals.”
Scientists at the Massachusetts Institute of Technology are developing an audio reading device to be worn on the index finger of people whose vision is impaired, giving them affordable and immediate access to printed words.
The so-called FingerReader, a prototype produced by a 3-D printer, fits like a ring on the user’s finger, equipped with a small camera that scans text. A synthesized voice reads words aloud, quickly translating books, restaurant menus and other needed materials for daily living, especially away from home or office.
Reading is as easy as pointing the finger at text. Special software tracks the finger movement, identifies words and processes the information. The device has vibration motors that alert readers when they stray from the script, said Roy Shilkrot, who is developing the device at the MIT Media Lab.
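The loop described above — scan a word under the fingertip, speak it aloud, and vibrate when the finger drifts off the line — can be sketched as follows. This is a minimal illustration with hypothetical names and simulated camera frames, not MIT's actual software:

```python
# Hypothetical sketch of a FingerReader-style processing loop.
# Each "frame" stands in for one camera capture: any word recognized
# under the fingertip, plus how far the finger has drifted vertically
# from the current line of text.

def process_frame(frame, speak, vibrate, max_drift_px=10):
    """Speak a newly recognized word, or vibrate if the finger strays."""
    if abs(frame["drift_px"]) > max_drift_px:
        vibrate()               # alert the reader they left the script
        return None
    word = frame.get("word")
    if word:
        speak(word)             # synthesized voice reads the word aloud
    return word

# Simulated frames: word recognized (if any) and vertical drift in pixels.
frames = [
    {"word": "Reading", "drift_px": 2},
    {"word": None,      "drift_px": 15},  # finger wandered off the line
    {"word": "menu",    "drift_px": 3},
]

spoken, buzzes = [], []
for f in frames:
    process_frame(f, spoken.append, lambda: buzzes.append("buzz"))

print(spoken)       # ['Reading', 'menu']
print(len(buzzes))  # 1
```

In a real device the frame would come from the ring's camera, the word from an OCR step, and the speech from a text-to-speech engine; the sketch only shows how the three feedback channels fit together.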
For Jerry Berrier, 62, who was born blind, the promise of the FingerReader is its portability and real-time functionality at school, in a doctor’s office or at a restaurant.
“When I go to the doctor’s office, there may be forms that I wanna read before I sign them,” Berrier said.
He said there are other optical character recognition devices on the market for people with vision impairments, but none that he knows of reads in real time.