David Edwards has strong opinions on scent. His big theory: We don’t give smell nearly enough attention. “When you think about how important the olfactive is in almost every type of communication,” he has said, “its absence in global communication is sort of astounding.”
He may be right. We tap on backlit screens and listen to all manner of media through headphones, but somehow scent—with its remarkable ability to tell stories and evoke emotion—still hasn’t broken into the technological mainstream. Edwards hopes to change that.
This week, Edwards, co-creator of a scent-sending device called oPhone, is launching oNotes. It’s an iPad app that brings together all the applications you can use with oPhone. It aims to be the home of all of your olfactory media—scent-augmented movies, books, photos and music. Edwards describes oNotes as the “iTunes of scent,” the control center for a new era of sensory experience focused on making smell as integral to media consumption as sight and sound.
The 2011 MasterChef champion Tim Anderson has a new apprentice--a robo-chef.
“It’s the ultimate sous-chef,” Anderson told BBC News. “You tell it to do something--whether it’s a bit of prep or completing a whole dish from start to finish--and it will do it.”
We’re already giving robots weapons, so why not let them take over our homes too? The London-based company Moley Robotics is demonstrating its new robot chef prototype at Hannover Messe, an annual industrial technology trade fair. The robot’s first dish will be crab bisque.
According to the company, the mechanical chef, which incorporates 20 motors, 24 joints and 129 sensors, learns how to cook by watching a plain old human chef, whose movements are turned into commands that drive the robot hands. Moley hopes to eventually create a product that can do everything from preparing the ingredients to cleaning up the kitchen, and include a built-in refrigerator and dishwasher.
The idea is to support the robot with thousands of app-like recipes, and it would allow owners to share their special recipes online.
Can robo-chef handle the complexities of cooking? Rich Walker, whose company Shadow Robot designed the machine, thinks it can overcome challenges like judging when beaten egg whites have reached stiff peaks.
“Something would change; we would see it in the sensor data. Maybe something gets stiffer or softer,” he said. “We should be able to sense that and use it as the point to transition to the next stage of the cooking process.”
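Walker's idea of watching sensor data for a change of state can be sketched in a few lines. This is purely illustrative (Moley's actual control software is not public): smooth a noisy stiffness reading, then flag the moment its rate of change plateaus as the cue to move to the next cooking stage.

```python
# Hypothetical sketch: detect a texture change (e.g., a mixture stiffening)
# as the point where a smoothed sensor signal stops changing appreciably.
# The thresholds and readings below are made-up values.

def moving_average(signal, window=3):
    """Smooth noisy sensor readings with a simple moving average."""
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

def find_transition(signal, window=3, plateau_eps=0.05):
    """Return the first index where the smoothed signal plateaus."""
    smooth = moving_average(signal, window)
    for i in range(1, len(smooth)):
        if abs(smooth[i] - smooth[i - 1]) < plateau_eps:
            return i  # stiffness stopped rising: move to the next stage
    return None

# Stiffness rising, then plateauing:
readings = [0.1, 0.3, 0.6, 0.9, 1.1, 1.15, 1.16, 1.16]
```

Running `find_transition(readings)` on this synthetic trace flags the plateau a few samples in, the kind of cue Walker describes using as a stage-transition point.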
Mini, BMW's Britain-based small-car subsidiary, will unveil a fun new concept at the Shanghai Motor Show later this month. But it's not a new car. It's, basically, a cooler version of Google Glass.
For starters, these actually look kind of like goggles, as in the kind World War I fighter pilots used to wear. Style matters since these aren't meant only to be worn inside your Mini Cooper.
If you've ever driven to a destination and then not known where to go once you got out of your car, Mini's Augmented Vision Goggles have an app just for that. Navigation prompts displayed inside the goggle's lenses will guide you, step by step, right to the front door. Likewise, if you forget where you parked your car, the goggles can guide you back.
Once you're in the car, the goggles will show navigation and other information like your speed and the speed limit for the road you're on. Even as the driver turns his head, data is always shown at a consistent spot just above the steering wheel.
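Keeping a readout pinned to a fixed spot in the world as the wearer's head turns is a standard head-tracking trick. As a minimal sketch (not Mini's implementation), the display subtracts the head's yaw from the element's fixed bearing in the car's frame to find where to draw it in the wearer's field of view:

```python
# Minimal sketch of a world-fixed HUD element: the element sits at a fixed
# bearing in the car's frame (e.g., 0 degrees = above the steering wheel),
# and we rotate that bearing into the head's frame as the head turns.

def element_bearing_in_view(element_bearing_deg, head_yaw_deg):
    """Where the element appears in the display, in degrees from view center.

    Result is normalized to (-180, 180]: negative = left of center.
    """
    return (element_bearing_deg - head_yaw_deg + 180) % 360 - 180

# Driver turns head 30 degrees to the right; the element (fixed at 0 degrees)
# should appear 30 degrees to the LEFT of the view center:
print(element_bearing_in_view(0, 30))  # -30
```

The same compensation, applied every frame, is what makes the data appear glued to the spot above the steering wheel rather than to the goggles themselves.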
Airliner cabins can get pretty germy. They're packed full of people from all over the world, who spend hours doing things like coughing, sneezing and touching surfaces with their grubby li'l hands. It was with this in mind that Arthur Kreitenberg and his son Mo created the GermFalcon. It's a robot that kills germs on planes, using ultraviolet light.
First of all, no, it doesn't roam around amongst the passengers while the airplane is in flight. Instead, it's intended for use between flights, while the aircraft is parked and empty.
The wheeled robot has the same footprint as an onboard drinks cart, so it's able to autonomously move down the aisle unimpeded – with the help of a proximity sensor. As it does so, it spreads its two "wings" over the seats on either side. Those wings contain UV-C lamps, which are the same type used for disinfection in places like hospitals and water treatment plants. It also has UV-C lamps on its top and sides.
According to the Kreitenbergs, in tests conducted on airliner seating areas, exposure to those lights killed 99.99 percent of microbes within 10 minutes.
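UV disinfection effectiveness is commonly modeled as log-linear decay with dose (intensity multiplied by exposure time). The numbers in this sketch are assumptions chosen for illustration, not GermFalcon specifications, but they show how a 99.99 percent (4-log) kill in 10 minutes falls out of the model:

```python
# Hedged sketch of the standard log-linear UV disinfection model.
# D90 is the dose producing a 90% (1-log) reduction; real D90 values
# vary widely by microbe, and the intensity below is a made-up figure.

def survival_fraction(intensity_mw_cm2, seconds, d90_mj_cm2):
    """Fraction of microbes surviving a given UV-C exposure."""
    dose_mj_cm2 = intensity_mw_cm2 * seconds  # mW/cm^2 * s = mJ/cm^2
    return 10 ** (-dose_mj_cm2 / d90_mj_cm2)

def kill_percent(intensity_mw_cm2, seconds, d90_mj_cm2):
    """Percentage of microbes killed."""
    return (1 - survival_fraction(intensity_mw_cm2, seconds, d90_mj_cm2)) * 100

# Assumed 0.2 mW/cm^2 over 10 minutes against a microbe with D90 = 30 mJ/cm^2:
# dose = 120 mJ/cm^2 = 4 x D90, i.e., a 4-log (99.99%) reduction.
print(kill_percent(0.2, 600, 30))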
Tel Aviv University researchers hope to turn smartphones into powerful hyperspectral sensors that determine precise spectral data for each pixel in an image.
As with the Star Trek tricorder, the enhanced smartphones would be capable of identifying the chemical components of objects from a distance, based on unique hyperspectral signatures.
The technology combines an optical component and image processing software, according to Prof. David Mendlovic of TAU’s School of Electrical Engineering and his doctoral student, Ariel Raz.
The researchers, together with spinoff Unispectral Technologies, have patented an optical component based on existing microelectromechanical (MEMS) technology. The design is suitable for mass production and compatible with standard smartphone camera designs.
Unispectral is in talks with other companies to analyze the images, using a large database of hyperspectral signatures. A prototype is scheduled for release in June, says Mendlovic.
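At its simplest, matching an image against a signature database means comparing each pixel's measured spectrum to known reference spectra. The sketch below is illustrative only; the material names and band values are invented, and real pipelines use many more spectral bands and more sophisticated classifiers than nearest-neighbor distance:

```python
# Toy hyperspectral matching: find the database signature closest
# (in Euclidean distance) to a measured per-pixel spectrum.
# All signatures and band values here are made up for illustration.
import math

SIGNATURES = {
    "chlorophyll": [0.05, 0.10, 0.45, 0.30],
    "water":       [0.02, 0.03, 0.04, 0.01],
    "plastic":     [0.30, 0.32, 0.31, 0.33],
}

def identify(spectrum):
    """Return the database entry nearest to the measured spectrum."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGNATURES, key=lambda name: dist(spectrum, SIGNATURES[name]))

pixel = [0.04, 0.11, 0.43, 0.28]  # a measured spectrum for one pixel
print(identify(pixel))  # "chlorophyll"
```

Scaled to every pixel in an image, this kind of lookup is what turns raw spectral data into the per-pixel material labels the applications below depend on.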
Applications of the sensor include consumer electronics, the automotive industry, biotechnology, homeland security, remote health monitoring, industrial quality control, and agricultural crop identification, according to Mendlovic.
Unispectral’s funders include SanDisk and Momentum Fund, which is backed by Tata Group Ltd. and Singapore-based Temasek.
A new computer program could soon analyze your “selfie” videos for clues to mental health.
Apps to monitor people’s health can track the spread of the flu, for example, or provide guidance on nutrition and managing mental health issues.
Jiebo Luo, professor of computer science at the University of Rochester, explains that his team’s approach is to “quietly observe your behavior” while you use the computer or phone as usual.
He adds that their program is “unobtrusive.” Users won’t need to wear special gear, describe their feelings, or add any extra information, he says.
From Tweets to forehead color
For example, the team was able to measure a user’s heart rate simply by monitoring very small, subtle changes in the user’s forehead color. The system does not grab other data that might be available through the phone—such as the user’s location.
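Extracting a heart rate from tiny forehead color fluctuations is a known photoplethysmography technique. As a toy sketch (not the Rochester team's actual pipeline), one can track the mean forehead color per video frame and count the periodic peaks; real systems band-pass filter the signal and work in the frequency domain:

```python
# Toy photoplethysmography sketch: estimate beats per minute by counting
# local maxima in the mean forehead color signal across video frames.
# The synthetic signal below is made up for illustration.

def estimate_bpm(color_signal, fps):
    """Estimate beats per minute by counting local maxima in the signal."""
    peaks = sum(
        1 for i in range(1, len(color_signal) - 1)
        if color_signal[i - 1] < color_signal[i] > color_signal[i + 1]
    )
    duration_min = len(color_signal) / fps / 60
    return peaks / duration_min

# Synthetic signal: 4 pulses over 2 seconds sampled at 10 fps -> 120 bpm
signal = [0, 1, 2, 1, 0, 1, 2, 1, 0, 1, 2, 1, 0, 1, 2, 1, 0, 0, 0, 0]
print(estimate_bpm(signal, fps=10))
```

The same principle, applied to a live selfie video stream, yields the unobtrusive heart-rate cue the researchers describe.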
The researchers were able to analyze the video data to extract a number of “clues,” such as heart rate, blinking rate, eye pupil radius, and head movement rate. At the same time, the program also analyzed what the users posted on Twitter, what they read, how fast they scrolled, their keystroke rate, and their mouse click rate.
Not every bit of information is treated equally, however: what a user tweets, for example, is given more weight than what the user reads because it is a more direct expression of what that user is thinking and feeling.
Sophie de Oliveira Barata runs the Alternative Limb Project, creating unique prosthetics designed to reflect the wearer's personality.
Sophie de Oliveira Barata started her career making realistic-looking artificial limbs for amputees.
But at university she had studied special effects prosthetics for TV and film, and wondered if she could use her skills to make limbs that looked more unusual and "spoke from people's soul".
Sophie set up the Alternative Limb Project and now makes bespoke, design-focused prosthetics from materials such as wood, glass and metal that reflect the wearer's personality and imagination, as well as making ultra-realistic limbs.
Among others, she has designed limbs for model and singer-songwriter Viktoria Modesta and athlete Jo-Jo Cranfield.
Following in the footsteps of Hiroshi Ishiguro's eerily lifelike creations, Toshiba introduced its very own take on the human-looking droid at Japan's CEATEC electronics trade show this week. The communication android has been built to communicate in Japanese sign language, requiring fluid and precise movement of its arms and hands.
The result of an in-house ideas program, the android has the look of a young Japanese woman, complete with blinking eyes and a "warm smile." Its human-like appearance and ability to emulate human expressions come courtesy of work undertaken by aLab Inc. and Osaka University, while the Shibaura Institute of Technology brought driving and sensor technologies to the party. Toshiba used its experience with industrial robots to create a custom algorithm to facilitate the movement of 43 actuators in the robot's joints.
It's still early days for the project, with the signing robot currently capable of simple greetings and phrases only, though Toshiba is aiming to have progressed to such a degree that the android will be capable of acting as a receptionist or exhibition guide within the next year.
There are plans to introduce speech recognition and synthesis technology for natural communication. Development is continuing towards a welfare and healthcare service robot, due by 2020, for the elderly and people with dementia, allowing carers or family members to keep watch on loved ones.
Benjamin Wittes and Jane Chong examine how the law will respond as we become more cyborg-like, and the divide between human and machine becomes ever-more unstable. In particular, they consider how the law of surveillance will shift as we develop from humans who use machines into humans who partially are machines or, at least, who depend on machines pervasively for our most human-like activities.
Implant attached to bone in pioneering technique that helps prevent infection and discomfort
Revolutionary technology at a north London hospital has transformed the lives of amputees taking part in a trial by allowing artificial limbs to be attached directly to their skeleton, giving them feeling and mobility far beyond that experienced by people with traditional prosthetics.
Unlike traditional ball-and-socket joints where a socket is placed over the soft tissue of the stump, Itap (intraosseous transcutaneous amputation prosthesis) involves insertion of a metal implant that forms a direct interface with the bone and sticks out through the skin for the prosthetic to be attached.
If the trial conducted at the Royal National Orthopaedic hospital (RNOH) and the Royal Orthopaedic hospital in Birmingham, which ended in June, is deemed a success, Itap could be rolled out across the UK and internationally through specialist clinics.
Mark O'Leary, 40, from south London, was one of the first of 20 above-the-knee amputees to take part in the trial. He described the change it had made to his life. "Just knowing where my foot is, my ability to know where it is improved dramatically because you can feel it through the bone. A textured road crossing, I can feel that. You essentially had no sensation with a socket and with Itap you can feel everything," he said.
"It's like they've given me my leg back. I know that sounds a bit trite. With this thing I just click the stump on in the morning and I can walk as far as I like, do anything I want within reason. There's no limit."
Using an Android tablet and the video game Angry Birds, children can program a robot to learn new skills.
Because end users can easily program the robot to learn tasks, researchers envision the robot-smart tablet system as a future rehabilitation tool for children with cognitive and motor-skill disabilities.
The researchers paired a small humanoid robot with an Android tablet. Kids teach it how to play Angry Birds by dragging their finger on the tablet to whiz the bird across the screen. The robot watches what happens and records “snapshots” in its memory.
The machine notices where fingers start and stop, and how the objects on the screen move relative to each other, while constantly keeping an eye on the score to check for signs of success.
When it’s the robot’s turn, it mimics the child’s movements and plays the game. If the bird is a dud and doesn’t cause any damage, the robot shakes its head in disappointment. If the building topples and points increase, the eyes light up and the machine celebrates with a happy sound and dance.
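The learn-by-watching loop described above can be sketched very simply. This is a hypothetical reduction of the idea, not the researchers' code: record each demonstrated swipe as a snapshot of its start point, end point, and resulting score gain, then mimic the most successful demonstration on the robot's turn:

```python
# Hypothetical sketch of learning from demonstration: store snapshots of
# demonstrated moves with their outcomes, then replay the best one.
# Coordinates and scores below are invented for illustration.

demonstrations = []

def observe(start, end, score_before, score_after):
    """Record a snapshot of one demonstrated swipe and its outcome."""
    demonstrations.append({
        "start": start,
        "end": end,
        "gain": score_after - score_before,
    })

def best_move():
    """Mimic the demonstration that produced the biggest score gain."""
    return max(demonstrations, key=lambda d: d["gain"])

observe(start=(10, 200), end=(80, 120), score_before=0, score_after=0)      # dud shot
observe(start=(10, 200), end=(95, 90),  score_before=0, score_after=15000)  # building topples
print(best_move()["end"])  # (95, 90)
```

Keying the choice to the observed score gain is what lets the system distinguish the dud shot from the one that topples the building, which in turn drives the robot's celebration or head-shake.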
Nasa plans to send Google's 3D smartphones into space to function as the "eyes and brains" of free-flying robots inside the Space Station.
The robots, known as Spheres (Synchronised Position Hold, Engage, Reorient, Experimental satellites), currently have limited capabilities.
It is hoped the smartphones, powered by Google's Project Tango, will equip the robots with more functionality.
The robots have been described by experts as "incredibly clever".
When Nasa's robots first arrived at the International Space Station in 2006, they were only capable of precise movements using small jets of CO2, which propelled the devices forwards at around an inch per second.
"We wanted to add communication, a camera, increase the processing capability, accelerometers and other sensors," Spheres project manager Chris Provencher told Reuters.
"As we were scratching our heads thinking about what to do, we realised the answer was in our hands. Let's just use smartphones."
In an attempt to make the robots smarter and of more use to astronauts, engineers at Nasa's Ames Research Centre sent cheap smartphones to the space station, which they had purchased from Best Buy, an American electronics shop.
Astronauts then attached the phones to the Spheres, giving them more visual and sensing capabilities.
Nowadays 3D printing is increasingly used for medical purposes and body upgrades to design devices, implants, and a variety of customized prosthetics, from a 3D printed face, to a skull, and even organs.
In the future we may look at the world with new – artificial, 3D printed – eyes. Italian research studio MHOX is working on EYE, a 3D bioprinted sight augmentation. The project envisions the removal of the natural visual system and its replacement with a digitally designed 3D printed one. The original retina would be replaced by a new artificial network, able to offer enhanced vision, WiFi connection and the possibility to record video and take pictures.
Intended to cure blindness and deliver better-than-natural vision, the 3D printed eyes are expected to be available by 2027.
The eyeballs will be constructed with the use of a bio-ink that contains the cells required to replace those found in natural eyes. Once the original pair of eyes is surgically removed, researchers plan on connecting the 3D printed one to a deck inside the head, which would allow the eyes to be inserted.
“We envision that the link between the deck and the EYE will be based on attractive forces between the tissues more than mechanical joints,” MHOX designer Filippo Nassetti explains. “To replace the EYE the user only has to put it in position inside the skull, and the tissues of the Deck and the EYE connect automatically.”
Researchers at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have found a new way of manipulating the walls that define magnetic domains (uniform areas in magnetic materials) and the results could one day revolutionize the electronics industry, they say.
Gong Chen and Andreas Schmid, experts in electron microscopy with Berkeley Lab’s Materials Sciences Division, led the discovery of a technique by which the “spin textures” of magnetic domain walls in ultrathin magnets can be switched between left-handed, right-handed, cycloidal, helical and mixed structures.
Electronic memory and logic
The “handedness” or “chirality” of spin texture determines the movement of a magnetic domain wall in response to an electric current, so this technique, which involves the strategic application of uniaxial strain, should lend itself to the creation of domains walls designed for desired electronic memory and logic functions.
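In ultrathin magnets, the preferred handedness of a domain wall is commonly attributed to the Dzyaloshinskii-Moriya interaction (DMI). As a rough textbook sketch, not an equation taken from the paper, the DMI contribution to the energy of neighboring spins can be written:

```latex
E_{\mathrm{DMI}} \;=\; -\sum_{\langle i,j \rangle} \mathbf{D}_{ij} \cdot \left( \mathbf{S}_i \times \mathbf{S}_j \right)
```

Here $\mathbf{S}_i$ and $\mathbf{S}_j$ are neighboring spins and $\mathbf{D}_{ij}$ is the DMI vector; the sign and orientation of $\mathbf{D}_{ij}$ favor one sense of spin rotation over the other, which is the chirality that, per the study, applied uniaxial strain can switch.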
“The information sloshing around today’s Internet is essentially a cacophony of magnetic domain walls being pushed around within the magnetic films of memory devices,” says Schmid. “Writing and reading information today involves mechanical processes that limit reliability and speed. Our findings pave the way to use the spin-orbit forces that act upon electrons in a current to propel magnetic domain walls either in the same direction as the current, or in the opposite direction, or even sideways, opening up a rich new smorgasbord of possibilities in the field of spin-orbitronics.”
The study was carried out at the National Center for Electron Microscopy (NCEM), which is part of the Molecular Foundry, a DOE Office of Science User Facility. The results have been reported in a Nature Communications paper titled “Unlocking Bloch-type chirality in ultrathin magnets through uniaxial strain.” Chen and Schmid are the corresponding authors. Other co-authors are Alpha N’Diaye, Sang Pyo Kang, Hee Young Kwon, Changyeon Won, Yizheng Wu and Z.Q. Qiu.
My dad has a story he likes to tell about one of his friends, a scientist. The scientist was giving a lecture in Japan, and opened with a joke that lasted a couple minutes. After delivering the joke in English, he waited for his translator to relay it to the audience. The translator spoke for only a few seconds, and then the crowd burst out laughing.
After the presentation was over, the scientist asked the translator how she managed to distill the humor of his joke down into such a concise form. She shrugged and said, “I said that the American visitor just told a very funny joke, and that they should all laugh now.”
The scientist's story illustrates the subjective, human quality of translation. Moving between languages is rarely a matter of transposing literal meaning; it requires the constant triage of unexpected inputs, endless judgment calls, and some social awareness. In other words, it’s something that humans are cut out for, and that computers are not.
Purdue University researchers have developed a potential manufacturing method called “mechanically sintered gallium-indium nanoparticles” that can inkjet-print flexible, stretchable conductors onto anything — including elastic materials and fabrics — and can mass-produce electronic circuits made of liquid-metal alloys for “soft robots” and flexible electronics.
The method uses ultrasound to break up liquid metal into nanoparticles in ethanol solvent to make ink that is compatible with inkjet printing.
Elastic technologies could make possible a new class of pliable robots and stretchable garments that people might wear to interact with computers or for therapeutic purposes.
“Liquid metal in its native form is not inkjet-able,” said Rebecca Kramer, an assistant professor of mechanical engineering at Purdue. “So what we do is create gallium-indium liquid metal nanoparticles that are small enough to pass through an inkjet nozzle.
“Sonicating [using ultrasound] liquid metal in a carrier solvent, such as ethanol, both creates the nanoparticles and disperses them in the solvent. Then we can print the ink onto any substrate. The ethanol evaporates away so we are just left with liquid metal nanoparticles on a surface.”
After printing, the nanoparticles must be rejoined by applying light pressure, which renders the material conductive. This step is necessary because the liquid-metal nanoparticles are initially coated with oxidized gallium, which acts as a skin that prevents electrical conductivity.
“But it’s a fragile skin, so when you apply pressure it breaks the skin and everything coalesces into one uniform film,” Kramer said. “We can do this either by stamping or by dragging something across the surface, such as the sharp edge of a silicon tip.”
After spending a week walking the showroom floors of CES, a wearable claiming to change your mood is probably going to activate your BS sensors. But today our demo of the Thync wearable was the rare CES meeting that's everything it's pretending to be – possibly more. Your neighborhood drug dealer might want to start looking for a new line of work.
The Thync has some similarities to TENS units (like those found in chiropractors' offices), but instead of slapping pads onto your lower back, you place them on your head. It uses "neurosignaling" to either calm you down or energize you.
As the company explains, "Neurosignaling uses electronic or ultrasonic waveforms to signal neural pathways in the brain. When specific pathways are stimulated, they trigger a shift in your state of mind or energy level."
Are your BS sensors going off yet? If so, we don't blame you. The technology world is full of stuff that sounds almost exactly like this, and most of it is about as authentic as Milli Vanilli.
But Thync works. During our demo with the Thync team, I tried the calming mode followed by the energized mode, and it was like drugs – minus all the bad stuff. More specifically, the calming mode was much like smoking a joint (minus the munchies, bloodshot eyes and memory loss). And though we were expecting the energizing mode to be similar to caffeine, it was more like the effects of Ephedrine (I used it a few times back in the 90s, before it started killing athletes, when it was sold over-the-counter). Rather than an antsy, over-caffeinated state, I found it to be more like a stimulated clarity, as if a veil of fuzzy grogginess I hadn't even been aware of had been lifted.
What if you could ask your smartphone for diet and exercise advice, the same way you ask Siri for driving directions?
Biotechnology company Pathway Genomics will soon offer an app that promises to do just that. “It’s meant to allow patients to be the CEO of their own health,” says Pathway Genomics CEO Jim Plante. “It will provide genomic information. It will pull in the patient's health records, connect to activity monitors like the Fitbit.”
It will also tap into IBM Watson, the machine learning system based on the supercomputer the company used to win the TV quiz show Jeopardy! The Watson online service contains a wealth of information from sources such as medical textbooks as well as the latest medical research journals, and IBM will use this to help power the Pathway Genomics app, after investing an undisclosed amount in the startup.
The app is just one of many—oh, so many—apps and devices aiming to improve our health through mobile and even wearable technology. Google and Apple are inviting developers to build health tools atop their wearable hardware, and various independent projects are moving in the same direction.
Though this is the first time Watson has ventured into consumer applications, it isn’t new to healthcare. One of the first uses outside of Jeopardy was at Cedars-Sinai Hospital’s Samuel Oschin Comprehensive Cancer Institute, where doctors were able to use the supercomputer to help diagnose illnesses. But the Panorama app will be the first time patients—as opposed to doctors—will have the chance to ask questions of the Watson platform directly.
In a world where economy-class seats are getting thinner and lavatories are shrinking, any flight longer than an hour can feel like a traveling prison. Aircraft manufacturer Airbus is abetting the shift, but a recent patent filing shows it hasn’t forgotten about you, the passenger who actually has to sit in these miserable flying cells. It’s considering helmets that will let you forget you’re in an airplane at all.
Flying can be boring or stressful, which is why airlines provide music, movies and bad TV. The next step appears to be thoroughly immersing passengers in what they’re watching. “The helmet in which the passenger houses his/her head offers him/her sensorial isolation with regard to the external environment,” reads the patent filing.
The helmets feature headphones to provide music. You can watch movies (perhaps in 3D) on the “opto-electronic” screen or possibly through “image diffusion glasses.” If you want to get some work done, turn on the virtual keyboard, which appears on your tray, don a pair of motion capture gloves, and type away. The helmet could even pipe in different odors for an olfactory treat, and the whole thing would be nicely ventilated.
The MICA bracelet displays messages and calendar alerts.
If you love jewelry, Intel has unveiled a sparkling bracelet that’s also a stand-alone message display device.
Unveiled for New York Fashion Week, My Intelligent Communication Accessory, or MICA, has glamorous looks as well as 3G cellular connectivity, so it doesn’t need to be tethered to a smartphone.
Designed by Humberto Leon and Carol Lim of fashion house Opening Ceremony, MICA is a cuff-style accessory covered with snakeskin as well as semiprecious stones such as obsidian and lapis. It will be available in two styles, one with white snakeskin and the other with black snakeskin, each with different stones.
The 1.6-inch sapphire-glass touchscreen can display SMS messages relayed through the bracelet’s Intel XMM6321 3G cellular radio. It can also display calendar alerts.
The bracelet will be sold as an Opening Ceremony product. Its weight and price have not been revealed yet, but it will be sold through some Barneys and Opening Ceremony stores by the December holiday season.
The chipmaker has been emphasizing wearables sold through other companies as mobile technology has put PCs in the shadows in recent years.
Intel announced the collaboration with Opening Ceremony at CES 2014, where it also showed off smart earbuds that can measure a runner’s heart rate.
In August, SMS Audio announced biometric headphones based on Intel’s technology.
The MICA bracelet also follows Intel’s acquisition of health-tracking wristband maker Basis Science in March.
North Carolina State University researchers have developed methods for electronically manipulating the flight muscles of moths and for monitoring the electrical signals that moths use to control those muscles. The goal: remotely-controlled moths, or “biobots,” for use in emergency response, such as search and rescue operations.
“The idea would be to attach sensors to moths … to create a flexible, aerial sensor network that can identify survivors or public health hazards in the wake of a disaster,” said Alper Bozkurt, PhD, an assistant professor of electrical and computer engineering at NC State and co-author of a JOVE paper on the work.
Bozkurt, with Amit Lal, PhD, of Cornell University, previously developed a method for attaching electrodes to a moth during its pupal stage, when the caterpillar is in a cocoon undergoing metamorphosis. Now, Bozkurt’s research team wants to find out precisely how a moth coordinates its muscles during flight.
Patients are more willing to disclose personal information to virtual humans, likely because virtual humans are perceived as lacking the capacity to judge them.
The findings show promise for people suffering from post-traumatic stress and other mental anguish, says Gale Lucas, a social psychologist at University of Southern California’s Institute for Creative Technologies.
“Today there’s no legislation regarding how much intelligence a machine can have, how interconnected it can be. If that continues, look at the exponential trend. We will reach the singularity in the timeframe most experts predict. From that point on you’re going to see that the top species will no longer be humans, but machines.”