Sensory neuroprostheses show great potential for alleviating major sensory deficits. It is not known, however, whether such devices can augment the subject’s normal perceptual range. Here we show that adult rats can learn to perceive otherwise invisible infrared light through a neuroprosthesis that couples the output of a head-mounted infrared sensor to their somatosensory cortex (S1) via intracortical microstimulation. Rats readily learn to use this new information source, and generate active exploratory strategies to discriminate among infrared signals in their environment. S1 neurons in these infrared-perceiving rats respond to both whisker deflection and intracortical microstimulation, suggesting that the infrared representation does not displace the original tactile representation. Hence, sensory cortical prostheses, in addition to restoring normal neurological functions, may serve to expand natural perceptual capabilities in mammals.
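The coupling the abstract describes, an infrared sensor reading driving intracortical microstimulation, can be sketched in a few lines. This is an illustrative assumption only (a simple linear mapping from sensor intensity to pulse rate; all names and parameter values are hypothetical), not the authors' actual encoding scheme.

```python
# Hedged sketch of a sensor-to-stimulation coupling: a head-mounted IR
# sensor reading is mapped to a microstimulation pulse rate in S1.
# The linear mapping, range limits, and names are illustrative guesses.

def ir_to_pulse_rate(ir_reading, ir_min=0.0, ir_max=1.0,
                     rate_min=0.0, rate_max=400.0):
    """Map a normalized IR sensor reading to a stimulation pulse rate (Hz)."""
    # Clamp the reading to the sensor's assumed working range.
    x = min(max(ir_reading, ir_min), ir_max)
    # Linear interpolation: stronger IR signal -> higher pulse rate.
    frac = (x - ir_min) / (ir_max - ir_min)
    return rate_min + frac * (rate_max - rate_min)

# As a rat sweeps its head across an IR source, the sensed intensity
# rises and falls, and the stimulation rate tracks it:
rates = [ir_to_pulse_rate(r) for r in (0.0, 0.25, 0.5, 1.0)]
```

Under a mapping like this, active head sweeps convert spatial IR gradients into a time-varying stimulation rate, which is one way to picture the "active exploratory strategies" the rats develop.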
Perceiving invisible light through a somatosensory cortical prosthesis • Eric E. Thomson, Rafael Carra & Miguel A.L. Nicolelis
"we seek less complex, more versatile tools: accessible yet fundamentally adaptable. we believe these parameters are most directly achieved through minimalistic design, enabling users to more quickly discover new ways to work, play, and connect. we see flexibility not as a feature but as a foundation."
Tyler Laboratory of Neuroscience and Neurotechnology at the Virginia Tech Carilion Research Institute and the School of Biomedical Engineering and Sciences at Virginia Tech. We study the influence of mechanical forces on brain activity, develop tools and approaches for mapping functional brain activity in humans using ultrasound, and engineer neurotechnology for noninvasively stimulating brain activity using ultrasound, tDCS, and TMS while monitoring activity in human brain circuits using EEG, fMRI, and fNIRS.
The International Conference on New Interfaces for Musical Expression gathers researchers and musicians from all over the world to share their knowledge and ...
Colbert Sesanker's insight:
Many of the shapes and limitations of our established musical interfaces no longer apply. The interface is now completely decoupled from the music, whose richest future, I believe, lies in lambda calculus.
Great... Just because it has some effect on the nervous system does not mean it will have any positive qualitative effect one dreams up. At least tDCS is vaguely based on the established idea of forced synchronization between two oscillators.
PubMed comprises more than 23 million citations for biomedical literature from MEDLINE, life science journals, and online books. Citations may include links to full-text content from PubMed Central and publisher web sites.
Colbert Sesanker's insight:
How? Another addition to the current movement of blind exploratory therapies: we blindly direct every 'safe' form of electromagnetic radiation at people's brains and look to see if something good happens. I guess the next paper will be on sound... oh wait!
"Temporary electronic tattoos could soon help people fly drones with thought alone and talk, seemingly telepathically, without speech over smartphones"
"His team is developing wireless flexible electronics that one can apply to the forehead just like temporary tattoos to read brain activity.
“We want something we can use in the coffee shop to have fun,” Coleman says.
The devices are less than 100 microns thick, the average diameter of a human hair. They consist of circuitry embedded in a layer of rubbery polyester that allows them to stretch, bend and wrinkle. They are barely visible when placed on skin, making them easy to conceal from others."
"John Underkoffler led the team that came up with this interface, called the g-speak Spatial Operating Environment. His company, Oblong Industries, was founded to move g-speak into the real world. Oblong is building apps for aerospace, bioinformatics, video editing and more. But the big vision is ubiquity: g-speak on every laptop, every desktop, every microwave oven, TV, dashboard. "It has to be like this," he says. "We all of us every day feel that. We build starting there. We want to change it all.""
Researchers have developed pioneering ‘tweezers’ that use ultrasound beams to grip and manipulate tiny clusters of cells, which could lead to life-changing medical advances, such as better cartilage implants that reduce the need for knee replacement operations. Using ultrasonic sound fields, cartilage cells taken from a patient’s knee can be levitated for weeks in a nutrient-rich fluid.
"The International Conference on New Interfaces for Musical Expression gathers researchers and musicians from all over the world to share their knowledge and late-breaking work on new musical interface design. The conference started out as a workshop at the Conference on Human Factors in Computing Systems (CHI) in 2001. Since then, an annual series of international conferences have been held around the world, hosted by research groups dedicated to interface design, human-computer interaction, and computer music."
PLOS ONE: "Realization of reaching and grasping movements by a paralytic person or an amputee would greatly facilitate her/his activities of daily living. Towards this goal, control of a computer cursor or robotic arm using neural signals has been demonstrated in rodents, non-human primates and humans. This technology is commonly referred to as a Brain-Machine Interface (BMI) and is achieved by predictions of kinematic parameters, e.g. position or velocity. However, execution of natural movements, such as swinging baseball bats of different weights at the same speed, requires advanced planning for necessary context-specific forces in addition to kinematic control. Here we show, for the first time, the control of a virtual arm with representative inertial parameters using real-time neural control of torques in non-human primates (M. radiata). We found that neural control of torques leads to ballistic, possibly more naturalistic movements than position control alone, and that adding the influence of position in a hybrid torque-position control changes the feedforward behavior of these BMI movements."
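The abstract's distinction between kinematic (position) control and torque control can be made concrete with a toy one-joint arm: under torque control the commanded signal sets acceleration, so the arm's inertia shapes the movement, whereas a pure position controller would reach the same angle regardless of inertia. A minimal sketch, with made-up parameters and simple Euler integration, not the paper's actual virtual-arm model:

```python
# Toy 1-DOF arm under torque control: theta'' = torque / inertia.
# Parameters and the integration scheme are illustrative assumptions.

def simulate_torque_control(torque, inertia=0.1, dt=0.001, steps=500):
    """Integrate constant-torque dynamics; return final joint angle (rad)."""
    theta, omega = 0.0, 0.0
    for _ in range(steps):
        omega += (torque / inertia) * dt  # torque commands acceleration
        theta += omega * dt
    return theta

# The same torque command moves a heavier (higher-inertia) arm less:
light = simulate_torque_control(0.5, inertia=0.1)
heavy = simulate_torque_control(0.5, inertia=0.4)
# This is why torque-based BMIs must plan context-specific forces,
# e.g. to swing bats of different weights at the same speed.
```

The asymmetry between `light` and `heavy` is the point: matching movement speed across objects of different inertia requires scaling the commanded force, the advance planning the abstract refers to.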