Cyborg Lives
Scooped by Wildcat2030 onto Cyborg Lives!

The Robot Will See You Now

IBM's Watson—the same machine that beat Ken Jennings at Jeopardy—is now churning through case histories at Memorial Sloan-Kettering, learning to make diagnoses and treatment recommendations.

Charley Lukov didn’t need a miracle. He just needed the right diagnosis. Lukov, a 62-year-old from central New Jersey, had stopped smoking 10 years earlier—fulfilling a promise he’d made to his daughter, after she gave birth to his first grandchild. But decades of cigarettes had taken their toll. Lukov had adenocarcinoma, a common cancer of the lung, and it had spread to his liver. The oncologist ordered a biopsy, testing a surgically removed sample of the tumor to search for particular “driver” mutations. A driver mutation is a specific genetic defect that causes cells to reproduce uncontrollably, interfering with bodily functions and devouring organs. Think of an on/off switch stuck in the “on” position. With lung cancer, doctors typically test for mutations called EGFR and ALK, in part because those two respond well to specially targeted treatments. But the tests are a long shot: although EGFR and ALK are the two driver mutations doctors typically see with lung cancer, even they are relatively uncommon. When Lukov’s cancer tested negative for both, the oncologist prepared to start a standard chemotherapy regimen—even though it meant the side effects would be worse and the prospects of success slimmer than might be expected using a targeted agent.

But Lukov’s true medical condition wasn’t quite so grim. The tumor did have a driver—a third mutation few oncologists test for in this type of case. It’s called KRAS. Researchers have known about KRAS for a long time, but only recently have they realized that it can be the driver mutation in metastatic lung cancer—and that, in those cases, it responds to the same drugs that turn it off in other tumors. A doctor familiar with both Lukov’s specific medical history and the very latest research might know to make the connection—to add one more biomarker test, for KRAS, and then to find a clinical trial testing the efficacy of KRAS treatments on lung cancer. But the national treatment guidelines for lung cancer don’t recommend such action, and few physicians, however conscientious, would think to do these things.

Cyborg Lives
Understanding our Cyborg lives, redescribing our reality
Curated by Wildcat2030
Scooped by Wildcat2030!

People's Deepest, Darkest Google Searches Are Being Used Against Them

Google knows the questions that people wouldn’t dare ask aloud, and it silently offers reams of answers. But it is a mistake to think of a search engine as an oracle for anonymous queries. It isn’t. Not even close.

In some cases, the most intimate questions a person is asking—about health worries, relationship woes, financial hardship—are the ones that set off a chain reaction that can have troubling consequences both online and offline.

All this is because being online increasingly means being put into categories based on a socioeconomic portrait of you that’s built over time by advertisers and search engines collecting your data—a portrait that data brokers buy and sell, but that you cannot control or even see. (Not if you’re in the United States, anyway.)

Consider, for example, a person who googles “need rent money fast” or “can’t pay rent.” Among the search results that Google returns, there may be ads that promise to help provide payday loans—ads designed to circumvent Google’s policies against predatory financial advertising. They’re placed by companies called lead generators, and they work by collecting and distributing personal information about consumers online. So while Google says it bans ads that guarantee foreclosure prevention or promise short-term loans without conveying accurate loan terms, lead generators may direct consumers to a landing page where they’re asked to input sensitive identifiable information. Then, payday lenders buy that information from the lead generators and, in some cases, target those consumers—online, via phone, and by mail—for the very sorts of short-term loans that Google prohibits.
Scooped by Wildcat2030!

Facebook's 'On This Day' now lets you edit out bad memories (Wired UK)

Facebook has added options to its On This Day tool, giving users more control over things they'd rather not have dredged up from their past.

The new preferences tab lets users edit memories and decide which people, events and time periods the social network should serve up -- and which it should leave well and truly in the past.

Since launching in March, On This Day has come under fire for casually forcing its users to relive sad, and sometimes even traumatic, memories they thought they'd left behind. But now Facebook seems to be taking cues from Eternal Sunshine of the Spotless Mind by giving its users more control over what they see.
Scooped by Wildcat2030!

Digital dependence 'eroding human memory' - BBC News

An over-reliance on using computers and search engines is weakening people's memories, according to a study.

It showed many people use computers instead of memorising information.

Many adults who could still recall their phone numbers from childhood could not remember their current work number or numbers of family members.

Maria Wimber from the University of Birmingham said the trend of looking up information "prevents the build-up of long-term memories".

The study, examining the memory habits of 6,000 adults in the UK, France, Germany, Italy, Spain, Belgium, the Netherlands and Luxembourg, found more than a third would turn first to computers to recall information.

The UK had the highest level, with more than half "searching online for the answer first".
Outsourcing memory

But the survey suggests relying on a computer in this way has a long-term impact on the development of memories, because such push-button information can often be immediately forgotten.

"Our brain appears to strengthen a memory each time we recall it, and at the same time forget irrelevant memories that are distracting us," said Dr Wimber.
Scooped by Wildcat2030!

Stop Googling. Let’s Talk.

COLLEGE students tell me they know how to look someone in the eye and type on their phones at the same time, their split attention undetected. They say it’s a skill they mastered in middle school when they wanted to text in class without getting caught. Now they use it when they want to be both with their friends and, as some put it, “elsewhere.”

These days, we feel less of a need to hide the fact that we are dividing our attention. In a 2015 study by the Pew Research Center, 89 percent of cellphone owners said they had used their phones during the last social gathering they attended. But they weren’t happy about it; 82 percent of adults felt that the way they used their phones in social settings hurt the conversation.

I’ve been studying the psychology of online connectivity for more than 30 years. For the past five, I’ve had a special focus: What has happened to face-to-face conversation in a world where so many people say they would rather text than talk? I’ve looked at families, friendships and romance. I’ve studied schools, universities and workplaces. When college students explain to me how dividing their attention plays out in the dining hall, some refer to a “rule of three.” In a conversation among five or six people at dinner, you have to check that three people are paying attention — heads up — before you give yourself permission to look down at your phone. So conversation proceeds, but with different people having their heads up at different times. The effect is what you would expect: Conversation is kept relatively light, on topics where people feel they can drop in and out.
Scooped by Wildcat2030!

Claws-on with the self-balancing Miposaur robot

We first saw WowWee's Miposaur robot at the London Toy Fair in January where it was self-balancing on two wheels similar to its older android sibling MiP. We recently tested out this T-rex's new features, which include an indoor GPS system for its TrackBall, a new phone app that extends the robot's abilities, and backwards compatibility with the old MiP to duke it out, virtual-laser-style.
Scooped by Wildcat2030!

Intelligent Machines: AI art is taking on the experts - BBC News

In a world where machines can do many things as well as humans, one would like to hope there remain enclaves of human endeavour to which they simply cannot aspire.

Art, literature, poetry, music - surely a mere computer without world experience, moods, memories and downright human fallibility cannot create these.

Meet Aaron, a computer program that has been painting since the 1970s - big, dramatic, colourful pieces that would not look out of place in a gallery.

The "paintings" Aaron does are realised mainly via a computer program and created on a screen although, when his work began being exhibited, a painting machine was constructed to support the program with real brushes and paint.

Aaron does not work alone of course. His painting companion is Harold Cohen, who has "spent half my life trying to get a computer program to do what only rather talented human beings can do".

A painter himself, he became interested in programming in the late 1960s at the same time as he was pondering his own art and asking whether it was possible to devise a set of rules and then "almost without thinking" make the painting by following the rules.

The programming behind Aaron - written in LISP, which was invented by one of the founding fathers of artificial intelligence, John McCarthy, back in the late 1950s - attempts to do just that.

Some of Aaron's knowledge is about the position of body parts and how they fit together, while some of the other rules are decided by the machine.

It actually "knows" very little about the world - it recognises the shape of people, potted plants, trees and simple objects such as boxes and tables. Instead of teaching it ever more things, Mr Cohen has concentrated on making it "draw better".

And it has been a great pupil.

"The machine had become a world-class colourist - it was much more adventurous in terms of colour than I was," he told the BBC.
Scooped by Wildcat2030!

The Struggle To Define What Artificial Intelligence Actually Means

When we talk about artificial intelligence (AI) – which we have done a lot recently, including my outline of liability and regulation issues on The Conversation – what do we actually mean?

AI experts and philosophers are beavering away on the issue. But having a usable definition of AI – and soon – is vital for regulation and governance because laws and policies simply will not operate without one.

This definition problem crops up in all regulatory contexts, from ensuring truthful use of the term “AI” in product advertising right through to establishing how next-generation automated weapons systems (AWSs) are treated under the laws of war.

True, we may eventually need more than one definition (just as “goodwill” means different things in different contexts). But we have to start somewhere so, in the absence of a regulatory definition at the moment, let’s get the ball rolling.
Defining the terms: artificial and intelligence

For regulatory purposes, “artificial” is, hopefully, the easy bit. It can simply mean “not occurring in nature or not occurring in the same form in nature”. Here, the alternative given after the “or” allows for the possible future use of modified biological materials.

This, then, leaves the knottier problem of “intelligence”.

From a philosophical perspective, “intelligence” is a vast minefield, especially if treated as including one or more of “consciousness”, “thought”, “free will” and “mind”. Although traceable back to at least Aristotle’s time, profound arguments on these Big Four concepts still swirl around us.

In 2014, seeking to move matters forward, Dmitry Volkov, a Russian technology billionaire, convened a summit on board a yacht of leading philosophers, including Daniel Dennett, Paul Churchland, and David Chalmers.

Perhaps unsurprisingly, no consensus was reached, and Chalmers suggested that it was unlikely to emerge within the next century.

Fortunately for would-be regulators, though, the philosophical arguments might be sidestepped, at least for a while. Let’s take a step back and ask: what is a regulator’s immediate interest here?

I would say that it is the work products of AI scientists and engineers, and any public welfare or safety risks that might arise from those products.

Logically, then, it is the way that the majority of AI scientists and engineers treat “intelligence” that is of most immediate concern.
Scooped by Wildcat2030!

The Holy Grail: Machine Learning + Extreme Robotics | KurzweilAI

Two experts on robotics and machine learning will reveal breakthrough developments in humanlike robots and machine learning at the annual SXSW conference in Austin next March, in a proposed* panel called “The Holy Grail: Machine Learning + Extreme Robotics.”

Attendees will interact with Hanson Robotics’ forthcoming state-of-the-art female Sophia robot, itself a participant on the panel, as she spontaneously tracks human faces, listens to speech, and generates natural-language responses while taking part in dialogue about the potential of genius machines.

This conversation on the future of advanced robotics combined with machine learning and cognitive science will feature visionary Hanson Robotics founder/CEO David Hanson and Microsoft executive Jim Kankanias, who heads Program Management for Information Management and Machine Learning in the Cloud + Enterprise Division at Microsoft. The panel will be moderated by Hanson Robotics consultant Eric Shuss.

Stay tuned here for updates.
Scooped by Wildcat2030!

Your Dreams of Using Nothing But Emoji Are Realized

While your smartphone is an easy conduit for all-emoji conversation, things get a little tough when you find yourself at a regular old laptop. Sure, keyboard shortcuts can get you there, but PC-made discussion is still dominated by… you know, words. Until now! Emoji Key is a set of stickers you can throw on top of your lettered keyboard. Then you just have to install the emoji keyboard on your laptop (the site includes instructions), and boom: You are typing in nothing but emoji. And yes, this would probably get confusing eventually.
Scooped by Wildcat2030!

Open Bionics robotic hand for amputees wins Dyson Award - BBC News

A prototype 3D-printed robotic hand that can be made faster and more cheaply than current alternatives is this year's UK winner of the James Dyson Award.

The Bristol-raised creator of the Open Bionics project says he can 3D-scan an amputee and build them a custom-fitted socket and hand in less than two days.

It typically takes weeks or months to obtain existing products.

Joel Gibbard says he aims to start selling the prosthetics next year.

"We have a device at the lower-end of the pricing scale and the upper end of functionality," he told the BBC.

"At the same time it is very lightweight and it can be customised for each person.

"The hand is basically a skeleton with a 'skin' on top. So, we can do different things to the skin - we can put patterns on it, we can change the styling and design. There's quite a lot of flexibility there."

The 25-year-old inventor intends to charge customers £2,000 for the device, including the cost of a fitting.

Although prosthetic arms fitted with hooks typically can be bought for similar prices, ones with controllable fingers are usually sold for between £20,000 and £60,000.
Scooped by Wildcat2030!

BMW developing high-tech racing wheelchair for 2016 Paralympic Games

Besides building luxury cars and motorcycles, BMW has made some pretty impressive sports gear, including an Olympic bobsled that drove Team USA to men's bronze and women's silver and bronze medals at the 2014 Sochi Olympics. BMW of North America announced today that it is now focusing attention on the upcoming 2016 Paralympic Games. It's reaching into its deep well of mechanical know-how to develop a racing wheelchair for the US track and field team.

The new racing chair represents BMW's fourth project in a six-year agreement with Team USA, following the bobsled and performance-tracking systems for both swimming and track and field. As in those past designs, the new project will involve adapting BMW vehicle technologies to the world of sports, and BMW will rely on its global creative consultancy Designworks in creating the new design. The California-based team will work directly with US track and field team athletes and coaches in identifying needs and solutions.

The project is still evolving, but BMW says that it will involve a complete overhaul of current racing wheelchair chassis design. The automaker will use its expertise in areas like aerodynamics, steering and braking, occupant restraint and carbon fiber construction in developing the new chair.
Scooped by Wildcat2030!

A precision brain-controlled prosthesis nearly as good as one-finger typing | KurzweilAI

An interdisciplinary team led by Stanford electrical engineer Krishna Shenoy has developed a technique to improve brain-controlled prostheses. These brain-computer-interface (BCI) devices, for people with neurological disease or spinal cord injury, deliver thought commands to devices such as virtual keypads, bypassing the damaged area.

The new technique addresses a problem with these brain-controlled prostheses: they currently access a sample of only a few hundred neurons, so tiny errors in the sample — neurons that fire too fast or too slow — reduce the precision and speed of thought-controlled keypads.

Understanding brain dynamics for arm movements

In essence the new prostheses analyze the neuron sample and quickly make dozens of corrective adjustments to the estimate of the brain’s electrical pattern.

Shenoy’s team tested a brain-controlled cursor meant to operate a virtual keyboard. The system is intended for people with paralysis and amyotrophic lateral sclerosis (ALS), also called Lou Gehrig’s disease, a condition that Stephen Hawking has. ALS degrades one’s ability to move.

The new corrective technique is based on a recently discovered understanding of how monkeys naturally perform arm movements. The researchers studied animals that were normal in every way. The monkeys used their arms, hands and fingers to reach for targets presented on a video screen. The researchers sought to learn, through hundreds of experiments, what the electrical patterns from the 100- to 200-neuron sample looked like during a normal reach — to understand the “brain dynamics” underlying reaching arm movements.

“These brain dynamics are analogous to rules that characterize the interactions of the millions of neurons that control motions,” said Jonathan Kao, a doctoral student in electrical engineering and first author of the open-access Nature Communications paper on the research. “They enable us to use a tiny sample more precisely.”
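The general principle, blending what a learned movement-dynamics model predicts with the noisy reading taken from a small neuron sample, can be sketched in a few lines of Python. The rotation matrix, noise level and blending weight below are invented for illustration; this is a toy sketch of the idea, not the algorithm from the Nature Communications paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "learned" dynamics: the underlying neural state rotates smoothly.
    theta = 0.1
    A = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    def corrected_estimate(noisy_obs, prev_estimate, alpha=0.7):
        # Blend the dynamics prediction with the raw observation from a small
        # neuron sample, pulling spurious jumps back toward plausible motion.
        predicted = A @ prev_estimate
        return alpha * predicted + (1 - alpha) * noisy_obs

    true_state = np.array([1.0, 0.0])
    estimate = true_state.copy()
    for t in range(5):
        true_state = A @ true_state                     # real (unobserved) state evolves
        noisy_obs = true_state + rng.normal(0, 0.3, 2)  # decoding error from few neurons
        estimate = corrected_estimate(noisy_obs, estimate)
        print(t, np.round(noisy_obs, 2), np.round(estimate, 2), np.round(true_state, 2))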
Scooped by Wildcat2030!

Robot can leap from water's surface - BBC News

Scientists have developed a tiny robot - based on the water strider insect - that can jump on water.

The robotic version uses the same forces to jump as the water strider - pushing off without breaking the surface.

It takes off with a downward force that never exceeds the surface tension of water - the force that "glues" surface water molecules together.

The South Korea and US team's advance is reported in the journal Science.

Lead researchers Prof Ho-Young Kim and Prof Kyu-Jin Cho, from Seoul National University, used water striders from their local pond in the study.

"To explore [their] amazing semi-aquatic motility, we collected [the insects] and recorded them jumping on water in the laboratory with high-speed cameras," the scientists said.

"[These imaging experiments] revealed that the insect rises upward while pushing the water surface downward and closing four of its legs inward."
Scooped by Wildcat2030!

Google's neural network to compose email responses for you

Gmail users are set to benefit from Google's machine learning research with Smart Reply. The system will use a deep neural network to not only analyze incoming emails for what information is required to form an appropriate response, but to propose three likely replies, with the end result enabling mobile users to respond quickly to emails.

Google's research blog details the initial challenge and the science that went into creating the technology. Crucial is a concept called sequence-to-sequence learning, already used in Google Translate and a chatbot the search giant released earlier this year.

In sequence-to-sequence learning, two neural networks fuse both understanding a language and synthesizing language. The encoding network creates a thought vector by reading each word in turn and representing it numerically, based on its context within the rest of the text. This grants the network the "idea" of the email.

The decoding network then generates potential responses, obviously not knowing which the human user might be partial to, but all of them making sense in the context of the encoded message and presenting suitable alternatives.
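To make the encoder/decoder split concrete, here is a minimal Python (PyTorch) sketch of a toy sequence-to-sequence model, assuming a tiny invented vocabulary, small hidden size and untrained weights; it illustrates the general technique only and is not Google's Smart Reply code.

    import torch
    import torch.nn as nn

    # Toy vocabulary; a real system learns a large one from email data.
    vocab = ["<pad>", "<sos>", "<eos>", "are", "we", "still", "on", "for", "friday", "yes", "no", "maybe"]
    stoi = {w: i for i, w in enumerate(vocab)}

    class Seq2Seq(nn.Module):
        def __init__(self, vocab_size, hidden=32):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)
            self.encoder = nn.GRU(hidden, hidden, batch_first=True)  # reads the incoming email
            self.decoder = nn.GRU(hidden, hidden, batch_first=True)  # generates the reply
            self.out = nn.Linear(hidden, vocab_size)

        def forward(self, src_ids, max_len=5):
            # Encoder: compress the whole message into a single "thought vector".
            _, thought = self.encoder(self.embed(src_ids))
            # Decoder: unroll word by word from <sos>, conditioned on that vector.
            token = torch.tensor([[stoi["<sos>"]]])
            hidden, reply = thought, []
            for _ in range(max_len):
                step, hidden = self.decoder(self.embed(token), hidden)
                token = self.out(step).argmax(dim=-1)
                word = vocab[token.item()]
                if word == "<eos>":
                    break
                reply.append(word)
            return reply

    model = Seq2Seq(len(vocab))
    email = torch.tensor([[stoi[w] for w in ["are", "we", "still", "on", "for", "friday"]]])
    print(model(email))  # untrained, so the reply is arbitrary until trained on email/response pairs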
Scooped by Wildcat2030!

A realistic bio-inspired robotic finger | KurzweilAI

A realistic 3D-printed robotic finger using a shape memory alloy (SMA) and a unique thermal training technique has been developed by Florida Atlantic University assistant professor Erik Engeberg, Ph.D.

“We have been able to thermomechanically train our robotic finger to mimic the motions of a human finger, like flexion and extension,” said Engeberg. “Because of its light weight, dexterity and strength, our robotic design offers tremendous advantages over traditional mechanisms, and could ultimately be adapted for use as a prosthetic device, such as on a prosthetic hand.”

Most robotic parts used today are rigid, have a limited range of motion and don’t look lifelike.

In the study, described in an open-access article in the journal Bioinspiration & Biomimetics, Engeberg and his team used a resistive heating process called “Joule” heating, in which the passage of electric current through a conductor releases heat.

How to create a robotic finger

The researchers first downloaded a 3-D computer-aided design (CAD) model of a human finger from the Autodesk 123D website (under a Creative Commons license).
With a 3-D printer, they created the inner and outer molds that housed a flexor and extensor actuator and a position sensor. The extensor actuator takes a straight shape when it’s heated and the flexor actuator takes a curved shape when heated.
They used SMA plates and a multi-stage casting process to assemble the finger.
Electric currents from a power source at the base of the finger flow through each SMA actuator, heating and cooling it to operate the robotic finger.

Results from the study showed a rapid flexing and extending motion of the finger and ability to recover its trained shape accurately and completely, confirming the biomechanical basis of its trained shape.
Scooped by Wildcat2030!

Teen romance usually digitally enhanced, says US study - BBC News

Technology plays a key role in teenage romance from initial encounters to eventual break-ups, says a US study.

Teenagers rarely meet online but do use technology for flirting, asking out, meeting up and parting, the American think tank Pew Research Center found.

A survey of 1,060 US teenagers aged 13 to 17 revealed that technology brings them closer but also breeds jealousy.

"Digital platforms are powerful tools for teens," said Amanda Lenhart, lead author of the report from Pew.

"But even as teens enjoy greater closeness with partners and a chance to display their relationships for others to see, mobile and social media can also be tools for jealousy, meddling and even troubling behaviour."
Digital romance, broken down

Of the 1,060 teenagers surveyed:

35% said they were currently dating and 59% of that group said technology made them feel closer to their partner
For boys who were dating, 65% said social media made them more connected to a significant other while it was 52% for girls
27% of dating teenagers thought social media made them feel jealous or insecure in relationships
50% of all teens surveyed, dating or not, said they had indicated interest by friending someone on Facebook or other social media and 47% expressed attraction by likes and comments
Texting is king - 92% of teens who were dating said they texted a partner, assuming the partner would check in with "great regularity"
Jealousy happens, but not as much as flirting does - 11% of teenage daters reported accessing a partner's online accounts and 16% reported having a partner asking them to de-friend someone

What gets discussed during all those frequent social media enabled check-ins?

According to the survey, it is mostly "funny stuff" followed by "things you're thinking about" as well as other information such as where they are and what their friends have been doing.

And forget having to meet up to resolve a conflict - 48% of dating teenagers said that could be done by texting or talking online.
Scooped by Wildcat2030!

In the Future, How Will We Talk to Our Technology?

One of the best scenes from Larry David’s tour-de-neuroses Curb Your Enthusiasm opens with Larry sitting at a restaurant. As cheesy music plays, the camera pans out, revealing the guy at the table next to him. He’s sitting alone, but jabbering loudly, reminding someone we can’t see that “on no planet is a shoe caddy a good gift.”

Then comes the reveal: Cut to the other side of this joker’s head, and there’s his Bluetooth headset. Larry, tired of his crap, starts talking loudly to himself. Eventually he fights with the guy next to him, and then they both go back to complaining to the empty chairs in front of them. Jerks.

The episode aired in 2007. Mercifully, the “Bluedouche” problem went away for a while after that—it was replaced by people sitting in silence, staring into their screens, which is at least easier to sit next to. Things are changing again: As we become more reliant on Siri, Google Now, Cortana, and the world of virtual assistants and voice-based apps and platforms, we’re starting to talk to our phones again. But this time, it should be way better.

Right now, we really only have one way to talk to our gadgets: We tap a button, bring the bottom half of our phone to our mouth, and speak extra-clearly into it. But few believe that’s how it’ll always be—and they have plenty of pop culture examples of this future. The earbud from Her, the screens-everywhere world of Total Recall, or the computer in Star Trek. But mostly it’s the earbud from Her.

Everywhere you turn, there’s a company working on this kind of wireless, unobtrusive, forget-it’s-in-there earpiece. Bragi’s Dash is probably the most commonly cited example, but there’s also the Pearbuds, the OwnPhones, the Motorola Hint, the HearNotes, the Earin buds, the Truebuds, and countless others from companies big and small.
Scooped by Wildcat2030!

Intelligent Machines: Chatting with the bots - BBC News

One of the ultimate aims of artificial intelligence is to create machines we can chat to.

A computer program that can be trusted with mundane tasks - booking our holiday, reminding us of dentist appointments and offering useful advice about where to eat - but also one that can discuss the weather and answer offbeat questions.

Alan Turing, one of the first computer scientists to think about artificial intelligence, devised a test to judge whether a machine was "thinking".

He suggested that if, after a typewritten conversation, a human was fooled into believing they had talked to another person rather than a computer program, the AI would be judged to have passed.

These days we chat to machines on a regular basis via our smart devices.

Whether it be Siri, Google Now or Cortana, most of us have a chatbot in our pockets.
Scooped by Wildcat2030!

How robot auctions are shaking up digital advertising - BBC News

How long does it take to load a webpage? Half a second or less, depending on the speed of your internet connection?

Well, in that fraction of a second, something extraordinary is happening behind the scenes.

A hyper-fast auction is taking place, involving tens of thousands of media buyers bidding for the right to launch advertising on that page - ads targeted at you.

All this happens automatically in just a few milliseconds.

It's called programmatic advertising - a catch-all term for the automation of many aspects of digital advertising - and it's making advertisers think twice about how they spend their money.

"We get an opportunity to bid on your eyeballs," explains Martin Kelly, chief executive of Infectious Media, a programmatic advertising specialist.

"We'll know what type of computer you're using, where you are roughly, the bit of the website you're looking at, but we won't have any personally identifiable information.
Scooped by Wildcat2030!

"Autobiographical memory" lets robots act as knowledge go-betweens for ISS crews

"Autobiographical memory" lets robots act as knowledge go-betweens for ISS crews | Cyborg Lives |
Anyone who's had to take on job responsibilities from someone who left the company months ago will appreciate this robotic system designed with the International Space Station (ISS) in mind. With the design challenge of retaining important experiential information between rotating crews of astronauts, French researchers used the popular Nao robot to form an "autobiographical memory" of human interactions and pass on the know-how to new crew members.

Led by Peter Ford Dominey, the team at the French Institute of Health and Medical Research chose the Nao humanoid robot because its programmable platform makes it one of the most evolved robots available on the market. With the system, humans can teach the Nao new actions through directly manipulating its joints, allowing it to mimic them by capturing their movements via a Kinect, or using voice commands. The Nao stores these interactions along with the context, such as who else was involved, when it took place, and a video of the demonstration.

In a video of sample interactions, a technician first teaches the Nao some basic interactions, like how to hold a smart card, and creates a plan for repair by syncing up commands and instructions for the Nao, including tipping its head forward to record the interaction. He then proceeds through a sample repair with Nao's assistance.
Scooped by Wildcat2030!

YouTube as you know it is about to change dramatically

The way you experience YouTube may be dramatically different before the end of the year. According to multiple sources, the world’s largest video-sharing site is preparing to launch its two separate subscription services before the end of 2015 — Music Key, which has been in beta since last November, and another unnamed service targeting YouTube’s premium content creators, which will come with a paywall. Taken together, YouTube will be a mix of free, ad-supported content and premium videos that sit behind a paywall.

With the exception of a few video rentals, YouTube has always been a free, ad-supported service. But the company is about to get serious about subscription services, offering new ways for the users that create videos to make money. While two subscription offerings for the same service might seem odd to some — with one music industry source calling it "strange on top of strange" — YouTube’s thinking was likened to that of a cable company offering different packages for sports and movies.
Scooped by Wildcat2030!

Robots Will Steal Our Jobs, But They’ll Give Us New Ones

At the Dusseldorf airport, robotic valet parking is now reality. You step out of your car. You press a button on a touch screen. And then a machine lifts your car off the ground, moving all three tons of it into a kind of aerial parking bay. Built by a German company called Serva Transport, the system saves you time. It saves garage space, thanks to those carefully arranged parking spots. And it’s a sign of so many things to come.

But the one thing it doesn’t do, says J.P. Gownder, an analyst with the Boston-based tech research firm Forrester, is steal jobs. In fact, it creates them. Before installing the robotic system, the airport already used automatic ticket machines, so the system didn’t replace human cashiers. And now, humans are needed to maintain and repair all those robotic forklifts. “These are not white-collar jobs,” Gownder tells WIRED. “This is the evolution of the repair person. It’s harder to fix a robot than it is to fix a vending machine.”
Scooped by Wildcat2030!

Bach or computer: Can you tell who made this music? - Futurity

Automation can unnerve some people, and the automation of art has a special power to offend humanity’s view of itself as soulful: How could a thing without psychological or emotional states express itself with the spirit and feeling seemingly necessary for making music?

“Before I encountered any of this stuff, I probably would have had a similar reaction,” says Donya Quick, a graduate student at Yale University who developed a computer program that composes original music that sounds like it was created by a human. “It’s an adverse reaction to novelty, the same way people first reacted to synthesizers.”

In two separate tests, each involving more than 100 human subjects of varied musical experience, participants listened to 40 short musical phrases, some written by humans, others by computer programs, including Quick’s, which she calls Kulitta.

The subjects were asked to rate the musical phrases on a seven-point scale ranging from “absolutely human” to “absolutely computer.” In both tests, Kulitta’s compositions rated, on average, on the human side of the scale.
‘It really does work’

The late Paul Hudak, Quick’s dissertation adviser at Yale, organized a separate series of informal public demonstrations where he juxtaposed a musical phrase composed by Kulitta with a phrase by J.S. Bach, the 18th-century German musical genius famous for his cello suites, fugues and chorales. Hudak then challenged audience members to identify which was which; invariably, even some music sophisticates confused Kulitta’s phrase for work by Bach.
Scooped by Wildcat2030!

This is how the world might look with a bionic eye - Futurity

Researchers used simulations to create short videos that mimic what vision would be like after two different types of sight recovery therapies. The results may be very different from what scientists and patients had previously assumed.

“This is the first visual simulation of restored sight in any realistic form,” says Ione Fine, associate professor of psychology at the University of Washington. “Now we can actually say, ‘This is what the world might look like if you had a retinal implant.'”

Fine says the goal of the project is to provide information about the quality of vision people can expect if they undergo sight restoration surgery, an invasive and costly procedure.

“This is a really difficult decision to make,” she says. “These devices involve long surgeries, and they don’t restore anything close to normal vision. The more information patients have, the better.”

For many people who have vision problems, the vision loss occurs after light enters the eye and lands on the retina, a thin layer at the back of the eye that contains millions of nerve cells. Among those are cells called rods and cones, which convert light into electrical impulses that are transmitted to vision centers in the brain.

Loss of rods and cones is the primary cause of vision loss in diseases such as macular degeneration or retinitis pigmentosa.

But those diseases leave most remaining neurons within the retina relatively intact, and various technologies under development aim to restore vision by targeting the surviving cells.
Two promising technologies

This is a pivotal time for the industry, Fine says, with one company that has a device on the market and several others set to enter the market in the next five to 10 years.

Two of the most promising approaches, she says, are electric prostheses, which enable vision by stimulating surviving cells with an array of electrodes placed on the retina, and optogenetics, which inserts proteins into the surviving retinal cells to make them light-sensitive.
Scooped by Wildcat2030!

I’m Quitting Social Media to Learn What I Actually Like

Three years ago, I began taking August off social media. I wasn’t alone. That was the year everyone started writing about digital detoxes, smartphone-free summer camps, and Facebook cleanses. One writer at the Verge took a year’s vacation from the Internet.

I don’t seem to see those stories as much anymore. To figure out why, I decided to ask my 1,868 Facebook friends. I pulled up the site, but before I could properly articulate the question, I noticed a guy I met briefly five years ago had posted hiking photos from the same place I went hiking last week. We had both been in Oregon!! What a coincidence! I clicked on the photo and saw he’d been there with a woman I knew from high school. Well, how do they know each other? I clicked on her photo and up came a profile pic of three tiny children, all adorable. The youngest had a Brown University shirt on. A little bit of digging revealed that, in fact, her husband had gotten a job at my alma mater and they’d all moved to Providence. I’d learned so much in just five minutes, but what was it I’d wanted to know from Facebook?