Bridging the worlds of neuroscience and high-tech virtual reality, the Glass Brain, a project of the new Neuroscape Lab at the University of California, San Francisco, may open up new insights into the complicated mechanisms of the brain.
Researchers have developed a new way to explore the human brain through virtual reality. The system, called the Glass Brain, was initiated by Philip Rosedale, creator of the virtual world Second Life, and Adam Gazzaley, a neuroscientist at UCSF. It combines brain scanning, brain recording, and virtual reality to let a user journey through a person's brain in real time.
For a recent demonstration at the South by Southwest (SXSW) Interactive festival in Austin, Texas, Rosedale's wife wore a cap studded with electroencephalogram (EEG) electrodes, which record brain activity by measuring differences in electric potential across the scalp. Rosedale, meanwhile, wore a virtual reality headset to explore her brain in 3D, with flashes of light displaying her brain activity from the EEG.
The Glass Brain didn’t actually show what Rosedale’s wife was thinking, but Gazzaley’s team ultimately hopes to get closer to decoding brain signals and displaying them using the virtual reality system.
The visualization itself is an anatomically realistic 3D brain depicting real-time source-localized activity (power and "effective" connectivity) derived from EEG (electroencephalographic) signals. Each color represents source power and connectivity in a different frequency band (theta, alpha, beta, gamma), and the golden lines are white matter anatomical fiber tracts. Estimated information transfer between brain regions is visualized as pulses of light flowing along the fiber tracts connecting the regions.
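To make the frequency-band idea concrete, here is a minimal sketch, not the Glass Brain's actual pipeline, of how per-band power might be computed from a single EEG signal using Welch's method. The band edges are common textbook conventions, and all names and parameters are illustrative.

```python
import numpy as np
from scipy.signal import welch

# Canonical EEG frequency bands in Hz. Exact edges vary across the
# literature; these are common conventions, not the project's own values.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Integrate a Welch power spectral density estimate over each band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    return {name: np.trapz(psd[(freqs >= lo) & (freqs < hi)],
                           freqs[(freqs >= lo) & (freqs < hi)])
            for name, (lo, hi) in BANDS.items()}

# One second of synthetic "EEG": a 10 Hz (alpha-band) rhythm plus noise.
fs = 256
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(fs)
print(band_powers(eeg, fs))  # alpha power should dominate
```

In the real system this kind of estimate is computed per reconstructed brain source and mapped to a color channel in the rendering.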
The modeling pipeline includes MRI (Magnetic Resonance Imaging) brain scanning to generate a high-resolution 3D model of an individual's brain, skull, and scalp tissue; DTI (Diffusion Tensor Imaging) to reconstruct the white matter tracts; and BCILAB (http://sccn.ucsd.edu/wiki/BCILAB) and SIFT (http://sccn.ucsd.edu/wiki/SIFT) to remove artifacts and statistically reconstruct the locations and dynamics (amplitudes and multivariate Granger-causal (http://www.scholarpedia.org/article/G...) interactions) of multiple sources of activity inside the brain from signals measured at electrodes on the scalp. In this demo, the signals came from a 64-channel "wet" mobile EEG system by Cognionics/BrainVision (http://www.cognionics.com).
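BCILAB and SIFT are MATLAB/EEGLAB toolboxes, and SIFT fits multivariate autoregressive models across many reconstructed sources, so the following is only a rough illustration of the underlying Granger-causality idea: a bivariate sketch in Python that asks whether the past of one signal improves prediction of another. Everything here is an assumption for illustration, not the project's code.

```python
import numpy as np
from scipy.stats import f as f_dist

def lag_matrix(s, order):
    """Columns are s[t-1], s[t-2], ..., s[t-order] for t = order .. len(s)-1."""
    return np.column_stack([s[order - k:len(s) - k] for k in range(1, order + 1)])

def granger_f_test(x, y, order=5):
    """F-test of whether past values of y help predict x (y "Granger-causes" x)."""
    target = x[order:]
    n = len(target)
    ones = np.ones((n, 1))
    X_restricted = np.hstack([ones, lag_matrix(x, order)])    # x's own past only
    X_full = np.hstack([X_restricted, lag_matrix(y, order)])  # plus y's past
    rss = lambda X: np.sum((target - X @ np.linalg.lstsq(X, target, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(X_restricted), rss(X_full)
    df1, df2 = order, n - 2 * order - 1
    F = ((rss_r - rss_f) / df1) / (rss_f / df2)
    return F, f_dist.sf(F, df1, df2)  # statistic and p-value

# y drives x with a 2-sample delay, so "y Granger-causes x" should be detected.
rng = np.random.default_rng(0)
y = rng.standard_normal(500)
x = np.empty(500)
x[:2] = 0.0
x[2:] = 0.8 * y[:-2] + 0.2 * rng.standard_normal(498)
print(granger_f_test(x, y))  # large F, tiny p: strong evidence for y -> x
print(granger_f_test(y, x))  # weak evidence for the reverse direction
```

The asymmetry between the two tests is what lets such estimates be drawn as directed flows of light along the fiber tracts.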
The final visualization is done in Unity and allows the user to fly around and through the brain with a gamepad while watching live brain activity from the person wearing the EEG cap.
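The project's actual Unity navigation code is not shown in this article. Purely as a sketch of the fly-through control scheme it describes (Unity itself would use C#), here is the core per-frame camera update in Python; the stick mapping, speeds, and axis convention are all assumptions.

```python
import numpy as np

def update_camera(pos, yaw, pitch, stick, dt, move_speed=0.5, turn_speed=1.5):
    """Advance a free-flight camera by one frame from gamepad input.

    stick = (lx, ly, rx, ry), each in [-1, 1]: left stick translates,
    right stick turns. This mapping is an assumption, not Glass Brain's.
    """
    lx, ly, rx, ry = stick
    yaw += rx * turn_speed * dt
    pitch = np.clip(pitch + ry * turn_speed * dt, -1.5, 1.5)  # avoid gimbal flip
    # Forward and right vectors derived from yaw/pitch, with Y as the up axis.
    forward = np.array([np.cos(pitch) * np.sin(yaw),
                        np.sin(pitch),
                        np.cos(pitch) * np.cos(yaw)])
    right = np.array([np.cos(yaw), 0.0, -np.sin(yaw)])
    # No collision test: the camera may pass through geometry, which is what
    # lets the user fly "through" the brain rather than stopping at it.
    return pos + (forward * ly + right * lx) * move_speed * dt, yaw, pitch

# Example frame: push the left stick forward while turning gently right.
pos, yaw, pitch = np.zeros(3), 0.0, 0.0
pos, yaw, pitch = update_camera(pos, yaw, pitch, (0.0, 1.0, 0.2, 0.0), dt=1 / 60)
print(pos, yaw, pitch)
```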