Cyborg Lives
 
Scooped by Wildcat2030
onto Cyborg Lives

The Robot Will See You Now

IBM's Watson—the same machine that beat Ken Jennings at Jeopardy—is now churning through case histories at Memorial Sloan-Kettering, learning to make diagnoses and treatment recommendations.


Charley Lukov didn’t need a miracle. He just needed the right diagnosis. Lukov, a 62-year-old from central New Jersey, had stopped smoking 10 years earlier—fulfilling a promise he’d made to his daughter, after she gave birth to his first grandchild. But decades of cigarettes had taken their toll. Lukov had adenocarcinoma, a common cancer of the lung, and it had spread to his liver. The oncologist ordered a biopsy, testing a surgically removed sample of the tumor to search for particular “driver” mutations. A driver mutation is a specific genetic defect that causes cells to reproduce uncontrollably, interfering with bodily functions and devouring organs. Think of an on/off switch stuck in the “on” position. With lung cancer, doctors typically test for mutations called EGFR and ALK, in part because those two respond well to specially targeted treatments. But the tests are a long shot: although EGFR and ALK are the two driver mutations doctors typically see with lung cancer, even they are relatively uncommon. When Lukov’s cancer tested negative for both, the oncologist prepared to start a standard chemotherapy regimen—even though it meant the side effects would be worse and the prospects of success slimmer than might be expected using a targeted agent.

But Lukov’s true medical condition wasn’t quite so grim. The tumor did have a driver—a third mutation few oncologists test for in this type of case. It’s called KRAS. Researchers have known about KRAS for a long time, but only recently have they realized that it can be the driver mutation in metastatic lung cancer—and that, in those cases, it responds to the same drugs that turn it off in other tumors. A doctor familiar with both Lukov’s specific medical history and the very latest research might know to make the connection—to add one more biomarker test, for KRAS, and then to find a clinical trial testing the efficacy of KRAS treatments on lung cancer. But the national treatment guidelines for lung cancer don’t recommend such action, and few physicians, however conscientious, would think to do these things.



Cyborg Lives
Understanding our Cyborg lives, redescribing our reality
Curated by Wildcat2030

Microsoft’s HoloLens Augmented Reality Headset Will Put Virtual Acrobats and Golfers in Your Living Room | MIT Technology Review

Demonstrations of augmented-reality displays typically involve tricking you into seeing animated content such as monsters and robots that aren’t really there. Microsoft wants its forthcoming HoloLens headset to mess with reality more believably. It has developed a way to make you see photorealistic 3-D people that fit in with the real world.

With this technology, you could watch an acrobat tumble across your front room or witness your niece take some of her first steps. You could walk around the imaginary people just as if they were real, your viewpoint changing seamlessly as if they were actually there. A sense of touch is just about the only thing missing.

That experience is possible because Microsoft has built a kind of holographic TV studio at its headquarters in Redmond, Washington. Roughly 100 cameras capture a performance from many different angles. Software uses the different viewpoints to create a highly accurate 3-D model of the person performing, resulting in a photo-real appearance.

The more traditional approach of using computer animation can’t compare, according to Steve Sullivan, who works on the project at Microsoft. He demonstrated what Microsoft calls “video holograms” at the LDV Vision Summit, an event about image-processing technology, in New York on Tuesday. More details of the technology will be released this summer.

“There’s something magical about it being real people and motion,” he said. “If you have a HoloLens, you really feel these performances are in your world.”

Microsoft is working on making it practical and cheap enough for other companies to record content in this form. It might one day be possible to visit a local studio and record a 3-D snapshot of a child at a particular point in life, said Sullivan.

Breakthrough bionic leg prosthesis controlled by subconscious thoughts

Biomedical engineering company Össur has announced the successful development of a thought controlled bionic prosthetic leg. The new technology uses implanted sensors sending wireless signals to the artificial limb's built-in computer, enabling subconscious, real-time control and faster, more natural responses and movements.

Prosthetics controlled by muscle impulses have been around since the late 1960s, but the technology has severe limitations. It works by laying sensors on the skin of the vestigial limb, which pick up electrical impulses that control, for example, an artificial arm. The trouble is, these sensors pick up electrical impulses from more than one muscle. This degrades performance, requires a lot of practice to operate properly, and makes the prosthesis slow, imprecise, and frustrating to use.

One answer to this is to use more precise sensor arrangements that make the limb, for all practical purposes, mind-controlled. The method is already used with great success on upper limbs and even artificial hands, but, paradoxically, it's been less successful with lower limbs.

Machine-Learning Algorithm Mines Rap Lyrics, Then Writes Its Own | MIT Technology Review

The ancient skill of creating and performing spoken rhyme is thriving today because of the inexorable rise in the popularity of rapping. This art form is distinct from ordinary spoken poetry because it is performed to a beat, often with background music.

And the performers have excelled. Adam Bradley, a professor of English at the University of Colorado has described it in glowing terms. Rapping, he says, crafts “intricate structures of sound and rhyme, creating some of the most scrupulously formal poetry composed today.”

The highly structured nature of rap makes it particularly amenable to computer analysis. And that raises an interesting question: if computers can analyze rap lyrics, can they also generate them?

Today, we get an affirmative answer thanks to the work of Eric Malmi at Aalto University in Finland and a few pals. These guys have trained a machine-learning algorithm to recognize the salient features of a few lines of rap and then choose another line that rhymes in the same way on the same topic. The result is an algorithm that produces rap lyrics that rival human-generated ones for their complexity of rhyme.

Various forms of rhyme crop up in rap but the most common, and the one that helps distinguish it from other forms of poetry, is called assonance rhyme. This is the repetition of similar vowel sounds such as in the words “crazy” and “baby” which share two similar vowel sounds. (That’s different from consonance, which uses similar consonant sounds, such as in “pitter patter” and different from perfect rhyme where words share the same ending sound such as “slang” and “gang.”)

Because of its prevalence in rap, Malmi and co focus exclusively on the way assonance appears in rap lyrics. But they also assume a highly structured form of verse consisting of 16 lines, each of which equals one musical bar and so must be made up of four beats. The lines typically, but not necessarily, rhyme at the end.

To train their machine learning algorithm, they begin with a database of over 10,000 songs from more than 100 rap artists.
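The assonance measure the article describes is easy to approximate. The sketch below is a hypothetical simplification: it counts matching vowel groups from the ends of two lines, using spelling as a crude stand-in for the phonetic transcription a system like Malmi's would actually rely on; the function names are illustrative, not from the paper.

```python
import re

def vowel_sounds(line):
    # Crude proxy for vowel phonemes: runs of vowel letters in the text.
    # (A real system would use a pronunciation dictionary instead of spelling.)
    return re.findall(r"[aeiouy]+", line.lower())

def assonance_length(line_a, line_b):
    """Length of the matching vowel-sound suffix shared by two lines.

    "crazy" / "baby" share their final two vowel sounds, so a pair of
    lines ending in those words scores at least 2.
    """
    va, vb = vowel_sounds(line_a), vowel_sounds(line_b)
    n = 0
    while n < len(va) and n < len(vb) and va[-1 - n] == vb[-1 - n]:
        n += 1
    return n
```

A generator in this style would score every candidate next line against the lines so far and keep the candidate with the longest assonance match, which is roughly the selection step the article describes.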

Teaching Robots to Appreciate Poetry

Over the course of 1967 and 1968, Argentine writer Jorge Luis Borges delivered a series of lectures at Harvard about the nature of human language. In one of these lectures, he spent a good deal of time ruminating on the importance of metaphor and its limitless possibilities in language. Borges theorized that despite these boundless possibilities for poetic language, there were nevertheless distinct patterns of metaphors that kept cropping up—a favorite example of his being the metaphorical equivalence of "stars" and "eyes."

It was this lecture series given by the surrealist writer that inspired Poetry for Robots, a project launched last week through a partnership between Neologic, Webvisions, and The Center for Science and the Imagination at Arizona State University. The project seeks to put Borges’ theory to the test, asking on their website whether it is possible to teach machines the poetic quality of human language.

L'Oreal to start 3D-printing skin - BBC News

French cosmetics firm L'Oreal is teaming up with bio-engineering start-up Organovo to 3D-print human skin.

It said the printed skin would be used in product tests.

Organovo has already made headlines with claims that it can 3D-print a human liver but this is its first tie-up with the cosmetics industry.

Experts said the science might be legitimate but questioned why a beauty firm would want to print skin.

L'Oreal currently grows skin samples from tissues donated by plastic surgery patients. It produces more than 100,000 skin samples, each 0.5 sq cm, per year, and grows nine varieties across all ages and ethnicities.

Its statement explaining the advantage of printing skin offered little detail: "Our partnership will not only bring about new advanced in vitro methods for evaluating product safety and performance, but the potential for where this new field of technology and research can take us is boundless."

How we made an octopus-inspired surgical robot using coffee

The unparalleled motion and manipulation abilities of soft-bodied animals such as the octopus have intrigued biologists for many years. How can an animal that has no bones transform its tentacles from a soft state to one stiff enough to catch and even kill prey?

A group of scientists and engineers has attempted to answer this question in order to replicate the abilities of an octopus tentacle in a robotic surgical tool. Last week, members of this EU-funded project known as STIFF-FLOP (STIFFness controllable Flexible and Learnable manipulator for surgical OPerations) unveiled the group’s latest efforts.

Conventional surgical robots are based on structures made from rigid linked components. This means they can only reach sites inside a patient’s abdomen by moving along straight lines and cannot navigate around organs that may be in the way. It also means they risk damaging healthy tissue during an operation.

Robot pets could change human relationship with animals, researcher says

Robot pets could be commonplace in 10 years' time and change the way we interact and relate with the real thing, a Melbourne researcher believes.

Dr Jean-Loup Rault from Melbourne University studies animal welfare and the way humans and animals interact with each other.

Recently, he has been looking into how technology has changed the way we relate to animals and pets.

"We know very little about robotic pets, virtual animals online and what they actually do to people," Dr Rault said.

"Is that going to change the way we relate to animals? Can that be a substitute to a live pet?

"Technology is moving very fast. The Tamagotchi in the early 1990s was really a prototype of a robotic pet and now Sony and other big companies have elaborated a lot on what have become robotic animals."

He said humans were able to become emotionally attached to objects.

"There's anecdotal evidence and a few studies that show people actually develop a bond, some kind of emotional attachment to those robots," Dr Rault said.

"They know it's not a live pet, they don't consider it as a live animal but they also don't consider it a mere object.

"It has an intermediate status between that of an animal and that of an object that projects some kind of personality."

A softer, gentler robot controlled by light | KurzweilAI

A bio-inspired prototype “soft robot” material with greater dexterity and mobility than conventional hard robots has been created by researchers at the University of Pittsburgh’s Swanson School of Engineering.

“In biology, directed movement involves some form of shape changes, such as the expansion and contraction of muscles,” said Anna C. Balazs, PhD, the Swanson School’s Distinguished Professor of Chemical and Petroleum Engineering. “So we asked whether we could mimic these basic interconnected functions in a synthetic system so that it could simultaneously change its shape and move.”

This New App Wants to Be the iTunes of Smells | WIRED

David Edwards has strong opinions on scent. His big theory: We don’t give smell nearly enough attention. “When you think about how important the olfactive is in almost every type of communication,” he has said, “its absence in global communication is sort of astounding.”

He may be right. We tap on backlit screens and listen to all manner of media through headphones, but somehow scent—with its remarkable ability to tell stories and evoke emotion—still hasn’t broken into the technological mainstream. Edwards hopes to change that.

This week, Edwards, co-creator of a scent-sending device called oPhone, is launching oNotes. It’s an iPad app that brings together all the applications you can use with oPhone. It aims to be the home of all of your olfactory media—scent-augmented movies, books, photos and music. Edwards describes oNotes as the “iTunes of scent,” the control center for a new era of sensory experience focused on making smell as integral to media consumption as sight and sound.

A Pair Of Robot Arms Could Make You Dinner

The 2011 MasterChef champion Tim Anderson has a new apprentice--a robo-chef.

“It’s the ultimate sous-chef,” Anderson told BBC News. “You tell it to do something--whether it’s a bit of prep or completing a whole dish from start to finish--and it will do it.”

We’re already giving robots weapons, so why not let them take over our homes too? The London-based company Moley Robotics is demonstrating their new robot chef prototype at Hannover Messe, an annual trade fair for industrial technology. The robot’s first dish will be crab bisque.

According to the company, the mechanical chef, which incorporates 20 motors, 24 joints and 129 sensors, learns how to cook by watching a plain old human chef, whose movements are turned into commands that drive the robot hands. Moley hopes to eventually create a product that can do everything from preparing the ingredients to cleaning up the kitchen, and include a built-in refrigerator and dishwasher.

The idea is to support the robot with thousands of app-like recipes and to allow owners to share their special recipes online.

Can robo-chef handle the complexities of cooking? Rich Walker, whose company Shadow Robot designed the machine, thinks it can overcome challenges like deciding when beaten eggs have peaked.

“Something would change; we would see it in the sensor data. Maybe something gets stiffer or softer,” he said. “We should be able to sense that and use it as the point to transition to the next stage of the cooking process.”

BMW's 'Mini' to unveil augmented reality driving goggles in China

Mini, BMW's Britain-based small-car subsidiary, will unveil a fun new concept at the Shanghai Motor Show later this month. But it's not a new car. It's, basically, a cooler version of Google Glass.

For starters, these actually look kind of like goggles, as in the kind World War I fighter pilots used to wear. Style matters, since these aren't meant only to be worn inside your Mini Cooper.

If you've ever driven to a destination and then not known where to go once you got out of your car, Mini's Augmented Vision Goggles have an app just for that. Navigation prompts displayed inside the goggle's lenses will guide you, step by step, right to the front door. Likewise, if you forget where you parked your car, the goggles can guide you back.

Once you're in the car, the goggles will show navigation and other information like your speed and the speed limit for the road you're on. Even as the driver turns his head, data is always shown at a consistent spot just above the steering wheel.

GermFalcon robot is made to sanitize airliners

Airliner cabins can get pretty germy. They're packed full of people from all over the world, who spend hours doing things like coughing, sneezing and touching surfaces with their grubby li'l hands. It was with this in mind that Arthur Kreitenberg and his son Mo created the GermFalcon. It's a robot that kills germs on planes, using ultraviolet light.

First of all, no, it doesn't roam around amongst the passengers while the airplane is in flight. Instead, it's intended for use between flights, while the aircraft is parked and empty.

The wheeled robot has the same footprint as an onboard drinks cart, so it's able to autonomously move down the aisle unimpeded – with the help of a proximity sensor. As it does so, it spreads its two "wings" over the seats on either side. Those wings contain UV-C lamps, which are the same type used for disinfection in places like hospitals and water treatment plants. It also has UV-C lamps on its top and sides.

According to the Kreitenbergs, in tests conducted on airliner seating areas, exposure to those lights killed 99.99 percent of microbes within 10 minutes.

A hyperspectral smartphone-based Star Trek ‘tricorder’ | KurzweilAI

Tel Aviv University researchers hope to turn smartphones into powerful hyperspectral sensors that determine precise spectral data for each pixel in an image.

As with the Star Trek tricorder,* the enhanced smartphones would be capable of identifying the chemical components of objects from a distance, based on unique hyperspectral signatures.

The technology combines an optical component and image processing software, according to Prof. David Mendlovic of TAU’s School of Electrical Engineering and his doctoral student, Ariel Raz.

The researchers, together with spinoff Unispectral Technologies, have patented an optical component based on existing microelectromechanical (MEMS) technology. The design is suitable for mass production and compatible with standard smartphone camera designs.

Unispectral is in talks with other companies to analyze the images, using a large database of hyperspectral signatures. A prototype is scheduled for release in June, says Mendlovic.

Applications of the sensor include consumer electronics, the automotive industry, biotechnology, homeland security, remote health monitoring, industrial quality control, and agricultural crop identification, according to Mendlovic.

Unispectral’s funders include Sandisk and Momentum Fund, which is backed by Tata Group Ltd. and Singapore-based Temasek.

Augmentation in the Bedroom: Biohacking and the Inevitability of Cyborg Sex

Have you ever wanted to upgrade your body? Make improvements to all those physical limitations? Build a better version of yourself?

How about augmenting your sex life?

Biohacking is a radical new scientific field that is as groundbreaking as it is notorious. Approaching the human body with what is described as a hacker ethic, it encompasses a wide variety of different practices, ranging from cybernetic augmentation to gene sequencing and biological manipulation.

Referring to themselves as Grinders, this community of fringe innovators aims to become the world’s first cyborgs, one implant at a time. They advocate an open-source upgrade culture, operating within a strange intersection of body modification, hardware fetishism, and home surgery.

And some of the advances they’re starting to make are truly incredible.

Basement enthusiasts are eagerly embracing the trend of magnetic implants. Eventually, we could all be seeing with expanded sensory devices, or streaming first-person POV footage straight to our computers.

We might even be having cyborg sex.

Next Big Future: A bionic lens that can be inserted in a procedure like cataract surgery can give vision three times better than 20/20 starting in 2017

Employing state-of-the-art materials and production techniques, Ocumetics™ Technology Corporation is pleased to announce the development of one of the world’s most advanced intraocular lenses, one that is capable of restoring quality vision at all distances, without glasses, contact lenses or corneal refractive procedures, and without the vision problems that have plagued current accommodative and multifocal intraocular lens designs.

Cataract surgery is the most common and successful procedure in medicine. It is a painless and gentle procedure. Utilizing standard surgical techniques, augmented by the accuracy of femtosecond laser incision technology, ophthalmic surgeons will be able to implant the Ocumetics™ Bionic Lens to enable patients to achieve their visual goals.

Brain controlled prosthetics are finally coming to market (Wired UK)

Brain-controlled prosthetics could be widely available in three years' time. Iceland-based orthopaedics company Ossur made the announcement after publicly demonstrating the working technology, currently being trialled by two volunteers.

However, given WIRED's May issue featured the story of a tetraplegic woman who could control a robotic arm using only her thoughts -- thanks to a series of electrodes linked to her brain -- you'd be forgiven for thinking brain-controlled prostheses were already par for the course.

And yes the tech, known as myoelectric prostheses, has been in development for years. They work by implanting tiny sensors into the muscle adjacent to the site of amputation, using salvaged nerves to send signals from the brain, via the sensor, to the prosthetic, where a receiver translates that message into movement. Ordinary electronic prostheses, including Ossur's original Proprio Foot, use algorithms to process data from sensors to predict a wearer's next movement. The company, which made Oscar Pistorius' Flex-Foot Cheetah blades, only delivered the upgraded version to two patients 14 months ago.

These Sols Adaptiv high-tops are 3D printed, robotic and adapt to your feet

The futuristic boots are made from a shell, 3D printed by specialists Shapeways, using a material called Elasto Plastic which is similar to nylon. The bonkers design is the work of Sols' collaborator on the project, Continuum Fashion.

But it doesn't stop there, a 3D printed inner boot can be completely customised to the wearer based on a 3D scan of the feet and ankles. And custom insoles inside, also 3D printed, will have air bags and air pockets to precisely alter the fit.

Less zen, but more efficient: How the digital age is really affecting our brains

A comprehensive Microsoft study is offering insights into how living in the digital age is affecting our ability to sustain attention, and how our brains are adapting to the constant flow of new stimuli. Although the results confirmed the suspicions that the information overflow is affecting our ability to focus on one task for long periods of time, the news isn't all bad, as it seems we're also training our brains to multitask more effectively.
From zen to multi-tabbing

When the dinner guest of zen master Thich Nhat Hanh offered to wash the dishes before enjoying some tea together, the master asked his guest if he truly knew how to wash the dishes. For, said the master, there are two ways of doing so: washing the dishes in order to enjoy a cup of tea later on, and washing the dishes in order to wash the dishes. If one washes the dishes the first way, then he also won't be able to enjoy the tea, as his mind will again be solely preoccupied with what comes next. But in the second way, even the simplest of tasks becomes enjoyable.

In the age of constant smartphone notifications, flashy ads and extreme multitasking, it seems that keeping a zen-like focus on a single task for extended periods of time is increasingly becoming a utopia. And because we know that our brains are remarkably flexible, adapting to our habits and environment, it's interesting to ask how people (heavy technology users in particular) are being affected by the digital age.

At first, it would be sensible to assume that the never-ending flow of stimuli is hurting our attention spans, as we quickly become accustomed to switching from watching TV, multi-tabbing our internet browsers and tinkering with our smartphones in a constant, addictive search for the next dopamine hit.

But a comprehensive study by Microsoft revealed that things aren't quite as black and white. Attention cannot be reduced to a single figure, because different tasks require different types of attention. The Microsoft study distinguished between three types of attention – sustained (maintaining prolonged focus during repetitive activities), selective (avoiding distraction) and alternating (efficiently switching between tasks) – and set out to understand how factors such as social media usage and multi-screening behavior affected different types of attention.

The research consisted of a comprehensive survey of 2,000 Canadians of all ages, along with in-depth neurological surveys to better quantify attention spikes. And although some results came out as expected, there were a few surprises.

Is memory needed in a digital age? - BBC News

How many of your family or friends' phone numbers can you remember off the top of your head?

I only ask because increasingly we all rely on our electronic devices to remember such information for us.

But when the idea of allowing students to use search engines in exams was suggested recently, the immediate fear was "dumbing down".

Only a few years ago, there was a similar debate about the use of calculators.

For the 11-year-olds sitting their national curriculum tests, often known as Sats, in England this week, the emphasis is on mental arithmetic.

Calculators are no longer permitted.

Their use will also be limited in the new GCSE maths exams, for which students will start studying this autumn.
No dictionaries

Dictionaries have had a similarly chequered track record in foreign language exams.

They were banned 15 years ago, after research suggested they gave the brightest students a greater advantage.

Newly redrafted GCSEs in French, Spanish and German will be introduced in 2016.

Computers That Know How You Feel Will Soon Be Everywhere | WIRED

Sometime next summer, you’ll be able to watch a horror series that is exactly as scary as you want it to be—no more, no less. You’ll pull up the show, which relies on software from the artificial intelligence startup Affectiva, and tap a button to opt in. Then, while you stare at your iPad, its camera will stare at you.

The software will read your emotional reactions to the show in real time. Should your mouth turn down a second too long or your eyes squeeze shut in fright, the plot will speed along. But if they grow large and hold your interest, the program will draw out the suspense. “Yes, the killing is going to happen, but whether you want to be kept in the tension depends on you,” says Julian McCrea, founder of the London-based studio Portal Entertainment, which has a development deal with a large unidentified entertainment network to produce the series. He calls Affectiva’s face-reading software, Affdex, “an incredible piece of technology.”

Microsoft Shows HoloLens' Augmented Reality Is No Gimmick | WIRED

Today, Microsoft demonstrated how far its augmented-reality HoloLens wonderland project has come. In fact, it cemented HoloLens’s place as one of the most exciting new technologies we have—just in ways that you may never actually see.

When HoloLens debuted in January, the use cases Microsoft proffered were largely domestic; you could build (Microsoft-owned) Minecraft worlds in your living room, or have conversations over (Microsoft-owned) Skype with far-flung friends who felt a few feet away. Even WIRED’s behind-the-scenes look back then mostly comprised games and other low-stakes living room interactions. While a broad range of industries and institutions have use for augmented reality, Microsoft spent the bulk of its HoloLens introduction emphasizing the device’s consumer potential.

3D Printed Eyes with WiFi Connection « NextNature.net

Nowadays 3D printing is increasingly used for medical purposes and body upgrades to design devices, implants, and a variety of customized prosthetics, from a 3D printed face, to a skull, and even organs.

In the future we may look at the world with new – artificial, 3D printed – eyes. Italian research studio MHOX is working on EYE, a 3D bioprinted sight augmentation. The project envisions the removal of the natural visual system and its replacement with a digitally designed 3D printed one. The original retina would be replaced by a new artificial network, able to offer enhanced vision, WiFi connection and the possibility to record video and take pictures.

The 3D printed eyes, intended to cure blindness and offer enhanced vision, are expected to be available by 2027.

The eyeballs will be constructed with the use of a bio-ink that contains the cells required to replace those found in natural eyes. Once the original pair of eyes is surgically removed, researchers plan to connect the 3D printed replacements to a “deck” inside the head, which would allow the eyes to be inserted.

“We envision that the link between the deck and the EYE will be based on attractive forces between the tissues more than mechanical joints,” MHOX designer Filippo Nassetti explains. “To replace the EYE, the user only has to put it in position inside the skull, and the tissues of the deck and the EYE connect automatically.”

‘Spin-orbitronics’ could ‘revolutionize the electronics industry’ by manipulating magnetic domains | KurzweilAI

Researchers at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have found a new way of manipulating the walls that define magnetic domains (uniform areas in magnetic materials) and the results could one day revolutionize the electronics industry, they say.

Gong Chen and Andreas Schmid, experts in electron microscopy with Berkeley Lab’s Materials Sciences Division, led the discovery of a technique by which the “spin textures” of magnetic domain walls in ultrathin magnets can be switched between left-handed, right-handed, cycloidal, helical and mixed structures.

Electronic memory and logic

The “handedness” or “chirality” of spin texture determines the movement of a magnetic domain wall in response to an electric current, so this technique, which involves the strategic application of uniaxial strain, should lend itself to the creation of domain walls designed for desired electronic memory and logic functions.

“The information sloshing around today’s Internet is essentially a cacophony of magnetic domain walls being pushed around within the magnetic films of memory devices,” says Schmid. “Writing and reading information today involves mechanical processes that limit reliability and speed. Our findings pave the way to use the spin-orbit forces that act upon electrons in a current to propel magnetic domain walls either in the same direction as the current, or in the opposite direction, or even sideways, opening up a rich new smorgasbord of possibilities in the field of spin-orbitronics.”

The study was carried out at the National Center for Electron Microscopy (NCEM), which is part of the Molecular Foundry, a DOE Office of Science User Facility. The results have been reported in a Nature Communications paper titled “Unlocking Bloch-type chirality in ultrathin magnets through uniaxial strain.” Chen and Schmid are the corresponding authors. Other co-authors are Alpha N’Diaye, Sang Pyo Kang, Hee Young Kwon, Changyeon Won, Yizheng Wu and Z.Q. Qiu.

This Is What a World Without Language Barriers Would Look Like

My dad has a story he likes to tell about one of his friends, a scientist. The scientist was giving a lecture in Japan, and opened with a joke that lasted a couple minutes. After delivering the joke in English, he waited for his translator to relay it to the audience. The translator spoke for only a few seconds, and then the crowd burst out laughing.

After the presentation was over, the scientist asked the translator how she managed to distill the humor of his joke down into such a concise form. She shrugged and said, “I said that the American visitor just told a very funny joke, and that they should all laugh now.”

The scientist's story illustrates the subjective, human quality of translation. Moving between languages is rarely a matter of transposing literal meaning; it requires the constant triage of unexpected inputs, endless judgment calls, and some social awareness. In other words, it’s something that humans are cut out for, and that computers are not.

Inkjet-printed liquid metal could lead to new wearable tech, soft robotics | KurzweilAI

Purdue University researchers have developed a potential manufacturing method called “mechanically sintered gallium-indium nanoparticles” that can inkjet-print flexible, stretchable conductors onto anything — including elastic materials and fabrics — and can mass-produce electronic circuits made of liquid-metal alloys for “soft robots” and flexible electronics.

The method uses ultrasound to break up liquid metal into nanoparticles in ethanol solvent to make ink that is compatible with inkjet printing.

Elastic technologies could make possible a new class of pliable robots and stretchable garments that people might wear to interact with computers or for therapeutic purposes.

“Liquid metal in its native form is not inkjet-able,” said Rebecca Kramer, an assistant professor of mechanical engineering at Purdue. “So what we do is create gallium-indium liquid metal nanoparticles that are small enough to pass through an inkjet nozzle.

“Sonicating [using ultrasound] liquid metal in a carrier solvent, such as ethanol, both creates the nanoparticles and disperses them in the solvent. Then we can print the ink onto any substrate. The ethanol evaporates away so we are just left with liquid metal nanoparticles on a surface.”

After printing, the nanoparticles must be rejoined by applying light pressure, which renders the material conductive. This step is necessary because the liquid-metal nanoparticles are initially coated with oxidized gallium, which acts as a skin that prevents electrical conductivity.

“But it’s a fragile skin, so when you apply pressure it breaks the skin and everything coalesces into one uniform film,” Kramer said. “We can do this either by stamping or by dragging something across the surface, such as the sharp edge of a silicon tip.”