pixels and pictures
Exploring the digital imaging chain from sensors to brains
Scooped by Philippe J DEWOST

Newly Launched EyeNetra Mobile Eye-Test Device Could Lead To Prescription Virtual-Reality Screens

After five years of development and about 40,000 tests worldwide, the smartphone-powered eye-test device developed by MIT spinout EyeNetra is coming to hospitals, optometric clinics, optical stores, and even homes nationwide.

But on the heels of its commercial release, EyeNetra says it’s been pursuing opportunities to collaborate with virtual-reality companies seeking to use the technology to develop “vision-corrected” virtual-reality displays.

“As much as we want to solve the prescription glasses market, we could also [help] bring virtual reality to the masses,” says EyeNetra co-founder Ramesh Raskar, an associate professor of media arts and sciences at the MIT Media Lab who co-invented the device.

The device, called Netra, is a plastic, binocular-like headset. Users attach a smartphone, with the startup’s app, to the front and peer through the headset at the phone’s display. Patterns, such as separate red and green lines or circles, appear on the screen. The user turns a dial to align the patterns and pushes a button to lock them in place. After eight interactions, the app calculates the difference between what the user sees as “aligned” and the actual alignment of the patterns. This signals any refractive errors, such as nearsightedness, farsightedness, and astigmatism. The app then displays the refractive powers, axis of astigmatism, and pupillary distance required for eyeglasses prescriptions.
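The measurement logic described above can be sketched as follows. This is only an illustrative guess at how the app might map alignment offsets to a prescription: the constant `PIXELS_PER_DIOPTER` and the function `refractive_power` are invented names, and the real mapping depends on the headset's optics and the phone's pixel pitch.

```python
# Hypothetical sketch: the app records, for each of the eight alignment
# interactions, the offset (in screen pixels) between where the user judged
# the patterns "aligned" and their true alignment, then maps the average
# offset to refractive power in diopters.

PIXELS_PER_DIOPTER = 12.0  # made-up calibration constant for illustration

def refractive_power(offsets_px: list[float]) -> float:
    """Estimate refractive error (diopters) from eight alignment offsets."""
    assert len(offsets_px) == 8, "Netra uses eight interactions per eye"
    mean_offset = sum(offsets_px) / len(offsets_px)
    return mean_offset / PIXELS_PER_DIOPTER

# A user who consistently "aligns" the patterns about 24 px off would read
# as roughly -2.0 D of myopia under this (made-up) calibration.
print(refractive_power([-24, -23, -25, -24, -24, -23, -25, -24]))
```

Averaging over eight interactions smooths out the noise in any single judgment, which is presumably why the app asks for repeated alignments rather than a single one.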

In April, the startup launched Blink, an on-demand refractive test service in New York, where employees bring the startup's optometry tools, including the Netra device, to people’s homes and offices. In India, EyeNetra has launched Nayantara, a similar program to provide low-cost eye tests to the poor and uninsured in remote villages, far from eye doctors. Both efforts used EyeNetra’s suite of tools, now available for eye-care providers worldwide.

According to the World Health Organization, uncorrected refractive errors are the world’s second-highest cause of blindness. EyeNetra originally invented the device for the developing world — specifically, for poor and remote regions of Africa and Asia, where many people can’t find health care easily. India alone has around 300 million people in need of eyeglasses.

Philippe J DEWOST's insight:

Interesting crossroads between VR and healthcare, and a sound reminder of how incredibly powerful smartphones have become!


Low-cost 'nano-camera' developed that can operate at the speed of light | NDTV Gadgets

Researchers at MIT Media Lab have developed a $500 "nano-camera" that can operate at the speed of light. According to the researchers, potential applications of the 3D camera include collision-avoidance, gesture-recognition, medical imaging, motion-tracking and interactive gaming.


The team which developed the inexpensive "nano-camera" comprises Ramesh Raskar, Achuta Kadambi, Refael Whyte, Ayush Bhandari, and Christopher Barsi at MIT, and Adrian Dorrington and Lee Streeter from the University of Waikato in New Zealand.

 

The nano-camera measures scenes using the "Time of Flight" method, also used by Microsoft in the new Kinect sensor that ships with the Xbox One. In Time of Flight, the location of objects is calculated from how long it takes transmitted light to reflect off a surface and return to the sensor. Unlike conventional Time of Flight cameras, however, the new camera produces accurate measurements even in fog or rain, and can correctly locate translucent objects.
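The ranging principle behind Time of Flight is simple: distance is the round-trip travel time of light, multiplied by the speed of light, divided by two. A minimal sketch of just that arithmetic (the function name `tof_distance` is ours, not from the article):

```python
# Time-of-Flight ranging: distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface given a light pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds puts the surface
# about one metre away.
print(tof_distance(6.67e-9))
```

The nanosecond timescales involved are what make the hardware hard: resolving millimetres requires timing (or phase) resolution on the order of picoseconds, which is where the cost of conventional femto-photography rigs comes from.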

Philippe J DEWOST's insight:

Meet the nano-camera, the $500 little sister of the 2011 $500,000 femto-camera...


Scientists reconstruct speech through soundproof glass by watching a bag of potato chips

Your bag of potato chips can hear what you're saying. Now, researchers from MIT are trying to figure out a way to make that bag of chips tell them everything that you said — and apparently they have a method that works. By pointing a video camera at the bag while audio is playing or someone is speaking, researchers can detect tiny vibrations in it that are caused by the sound. When later playing back that recording, MIT says that it has figured out a way to read those vibrations and translate them back into music, speech, or seemingly any other sound.
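A heavily simplified sketch of the core idea: reduce each video frame of the bag to a single number, so that sound-induced vibrations show up as a time series sampled at the camera's frame rate. MIT's actual "visual microphone" analyzes local motion far more carefully and relied on high-speed cameras; the function below (our own `vibration_signal`, demonstrated on synthetic frames) captures only the intuition.

```python
import numpy as np

def vibration_signal(frames: np.ndarray) -> np.ndarray:
    """frames: (n_frames, height, width) grayscale video of the object.
    Returns a zero-mean, peak-normalized 1-D signal at the frame rate."""
    sig = frames.reshape(len(frames), -1).mean(axis=1)  # one value per frame
    sig = sig - sig.mean()                              # remove the DC offset
    peak = np.abs(sig).max()
    return sig / peak if peak > 0 else sig

# Synthetic demo: a 440 Hz "tone" modulating brightness, filmed at 2000 fps.
fps, secs = 2000, 0.05
t = np.arange(int(fps * secs)) / fps
frames = 128 + 5 * np.sin(2 * np.pi * 440 * t)[:, None, None] * np.ones((1, 4, 4))
recovered = vibration_signal(frames)  # the 440 Hz tone, at unit amplitude
```

Note the sampling constraint this makes obvious: an ordinary 30 or 60 fps camera cannot directly capture frequencies in the speech range, which is one reason the researchers' high-speed and rolling-shutter tricks matter.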

Philippe J DEWOST's insight:

Throw away your bag of chips before engaging in a confidential conversation. And avoid any line of sight.


MIT's Halide programming language can dramatically speed up image processing

A new programming language for image-processing algorithms yields code that runs much faster, reports the Massachusetts Institute of Technology — and this could lead to much better in-camera performance in dedicated devices and smartphones.
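The canonical example from the Halide work is a 3×3 box blur written as two one-dimensional passes; Halide's key idea is separating this *algorithm* from its *schedule* (tiling, vectorization, parallelism), which the compiler can then optimize aggressively. A plain NumPy sketch of just the algorithm half, for illustration only (this is not Halide code):

```python
import numpy as np

def blur_3x3(img: np.ndarray) -> np.ndarray:
    """3x3 box blur as a horizontal pass followed by a vertical pass."""
    bx = (img[:, :-2] + img[:, 1:-1] + img[:, 2:]) / 3.0  # horizontal pass
    by = (bx[:-2, :] + bx[1:-1, :] + bx[2:, :]) / 3.0     # vertical pass
    return by

img = np.arange(25, dtype=float).reshape(5, 5)
out = blur_3x3(img)  # blurred interior, shape (3, 3)
```

In Halide, this same two-pass pipeline would be written once, and the decisions NumPy hides here (when to materialize the intermediate `bx`, how to tile and vectorize the loops) become an explicit, separately tunable schedule — that separation is where the reported speedups come from.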
