Amazing Science
Scooped by Dr. Stefan Gruenwald!

Biorobotics-inspired eye stabilizes robot’s flight, replaces inertial navigation system

Biorobotics researchers have developed the first aerial robot able to fly over uneven terrain that is stabilized visually without an accelerometer.

Called BeeRotor, it adjusts its speed and avoids obstacles thanks to optic flow sensors inspired by insect vision. It can fly along a tunnel with uneven, moving walls without measuring either speed or altitude. The study was published on February 26 in the journal Bioinspiration & Biomimetics.

Aircraft, ships, and spacecraft currently use complex inertial navigation systems based on accelerometers and gyroscopes to continuously calculate position, orientation, and velocity without the need for external references (known as dead reckoning).

Researchers Fabien Expert and Franck Ruffier at the Institut des Sciences du Mouvement – Etienne-Jules Marey (CNRS/Aix-Marseille Université) decided to create a simpler system inspired by winged insects. They created BeeRotor, a tethered flying robot able, for the first time, to adjust its speed and follow terrain with no accelerometer and without measuring speed or altitude, avoiding vertical obstacles in a tunnel with moving walls.

To achieve this, the researchers mimicked the ability of insects to use the passing landscape as they fly. This is known as “optic flow,” the principle you can observe when driving along a road: the view in front is fairly stable, but looking out to either side, the landscape passes by faster and faster, reaching a maximum at an angle of 90 degrees to the path of the vehicle.

To measure optic flow, BeeRotor is equipped with 24 photodiodes (functioning as pixels) distributed at the top and the bottom of its “eye.” This enables it to detect contrasts in the environment as well as their motion. As in insects, the speed at which a feature in the scenery moves from one pixel to another provides the angular velocity of the flow. When the flow increases, this means that either the robot’s speed is increasing or that the distance relative to obstacles is decreasing.
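The relationships described above can be put in code; the functions and numbers below are illustrative sketches, not BeeRotor's actual parameters:

```python
import math

def measured_flow(pixel_separation_rad, transit_time_s):
    """Angular velocity of a contrast feature, from the time it takes
    to travel between two adjacent photodiode 'pixels'."""
    return pixel_separation_rad / transit_time_s

def translational_flow(speed, distance, bearing_rad):
    """Flow generated by pure forward motion: it is largest at 90 degrees
    to the flight path, and it grows when speed rises or the distance
    to the obstacle shrinks."""
    return (speed / distance) * math.sin(bearing_rad)

# pixels 0.1 rad apart, crossed in 50 ms: 2 rad/s of flow
measured_flow(0.1, 0.05)
# flying at 2 m/s, 1 m above the floor, looking straight down
translational_flow(2.0, 1.0, math.pi / 2)
```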

By way of a brain, BeeRotor has three feedback loops: altitude (following the floor or roof), speed (adapting to the size of the tunnel), and stabilization of the eye in relation to the local slope. This enables the robot to always obtain the best possible field of view, independently of its degree of pitch, and allows BeeRotor to avoid very steeply sloping obstacles (see video) with no accelerometer and no measurements of speed or altitude.
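A minimal sketch of one such feedback loop, assuming a simple proportional regulator (the article does not disclose BeeRotor's actual control law, gains, or setpoints):

```python
def altitude_command(flow_measured, flow_setpoint=2.0, gain=0.5):
    """Hold ventral optic flow at a setpoint. Flow above the setpoint
    means the floor is too close (or the robot too fast), so climb;
    flow below it means there is room to descend."""
    return gain * (flow_measured - flow_setpoint)  # positive = climb

altitude_command(3.0)  # floor rushing past too fast: climb command
altitude_command(1.0)  # sparse flow: descend toward the terrain
```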

BeeRotor suggests a biologically plausible hypothesis for how insects can fly without an accelerometer: using optic flow cues to remain stable via feedback loops. Optic flow sensors also have industrial applications, such as replacing heavy accelerometers on small robots, or serving as an ultra-light backup system in the event of a failure on a space mission.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Supporting the elderly: A caring robot with ‘emotions’ and memory

Researchers at the University of Hertfordshire have developed a prototype of a social robot that supports independent living for the elderly, working in partnership with their relatives or carers.

Farshid Amirabdollahian, a senior lecturer in Adaptive Systems at the university, led a team of nine partner institutions from five European countries as part of the €4,825,492 project called ACCOMPANY (Acceptable Robotics Companions for Ageing Years).

“This project proved the feasibility of having companion technology, while also highlighting different important aspects such as empathy, emotion, social intelligence as well as ethics and its norm surrounding technology for independent living,” Amirabdollahian said.

Madison & Morgan's curator insight, February 11, 1:31 PM

This article is about a robot that can help the elderly in their daily lives. The robot is capable of human-like emotions and has moral ethics. This shows the technological advances Europe has made, and relates to its economy.

olyvia Schaefer and Rachel Shaberman's curator insight, February 11, 5:09 PM

Europe Arts

Europe has created many inventions, but the most interesting to me is the robot that has emotions and memory. This robot is supposed to help the elderly, alongside their carers, in daily life. The Europeans were able to create technology that has empathy, emotions, and social intelligence, and it is just a robot. The Europeans were able to accomplish something amazing.

ToKTutor's curator insight, February 21, 12:06 PM

Title 5: If a robot can have emotion and memory, can it also be programmed to have instinctive judgment?

Scooped by Dr. Stefan Gruenwald!

What Happens to a Society when Robots Replace Workers?

The technologies of the past, by replacing human muscle, increased the value of human effort – and in the process drove rapid economic progress. Those of the future, by substituting for man’s senses and brain, will accelerate that process – but at the risk of creating millions of citizens who are simply unable to contribute economically, and with greater damage to an already declining middle class.

Estimates of general rates of technological progress are always imprecise, but it is fair to say that, in the past, progress came more slowly. Henry Adams, the historian, measured technological progress by the power generated from coal, and estimated that power output doubled every ten years between 1840 and 1900, a compounded rate of progress of about 7% per year. The reality was probably much less. For example, in 1848, the world record for rail speed reached 60 miles per hour. A century later, commercial aircraft could carry passengers at speeds approaching 600 miles per hour, a rate of progress of only about 2% per year.
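The growth-rate claims in this paragraph can be checked with the standard compound-rate formula:

```python
def annual_rate(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# power output doubling every ten years (Henry Adams' estimate)
annual_rate(1, 2, 10)      # about 7.2% per year
# rail at 60 mph in 1848 to aircraft near 600 mph a century later
annual_rate(60, 600, 100)  # about 2.3% per year
```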

By contrast, progress today comes rapidly. Consider the numbers for information storage density in computer memory. Between 1960 and 2003, those densities increased by a factor of five million, at times progressing at a rate of 60% per year. At the same time, true to Moore’s Law, semiconductor technology has been progressing at a 40% rate for more than 50 years. These rates of progress are embedded in the creation of intelligent machines, from robots to automobiles to drones, that will soon dominate the global economy – and in the process drive down the value of human labor with astonishing speed.

This is why we will soon be looking at hordes of citizens of zero economic value. Figuring out how to deal with the impacts of this development will be the greatest challenge facing free market economies in this century. If you doubt the march of worker-replacing technology, look at Foxconn, the world’s largest contract manufacturer. It employs more than one million workers in China. In 2011, the company installed 10,000 robots, called Foxbots. Today, the company is installing them at a rate of 30,000 per year. Each robot costs about $20,000 and is used to perform routine jobs such as spraying, welding, and assembly. On June 26, 2013, Terry Gou, Foxconn’s CEO, told his annual meeting that “We have over one million workers. In the future we will add one million robotic workers.” This means, of course, that the company will avoid hiring those next million human workers.

Just imagine what a Foxbot will soon be able to do if Moore’s Law holds steady and we continue to see performance leaps of 40% per year. Baxter, a $22,000 robot that just got a software upgrade, is being produced in quantities of 500 per year. A few years from now, a much smarter Baxter produced in quantities of 10,000 might cost less than $5,000. At that price, even the lowest-paid workers in the least developed countries might not be able to compete.

Tomasz Bienko's curator insight, January 19, 12:29 PM

Above all, machines may replace humans as a labor force, but that is partly why the research is being done and new technologies are being introduced in the first place; you can already see it in the mechanization of individual sectors of the economy (e.g. agriculture). People try to simplify their lives, yet may end up eating their own tail. This is probably a nearer problem that we will face as we develop this technology further, rather than, say, the more distant prospect of an artificial intelligence rebelling. Given how Moore's Law is revised year after year, the changes will be observable soon, and rising unemployment may be the first problem we notice in the development of artificial intelligence. Machines will not replace humans in everything, in every role, but perhaps that, too, is only a matter of time?

Scooped by Dr. Stefan Gruenwald!

Delivery drones test successful in France

If pilot projects from companies like Bizzby and DHL Parcel are any indication, the skies of Europe could soon be buzzing with parcel delivery drones. GeoPost, the express delivery arm of French mail service La Poste, has now revealed that it undertook drone delivery testing at the Centre d'Etudes et d'Essais pour Modèles Autonomes (CEEMA) in September.

As part of its ongoing GeoDrone project, GeoPost partnered with Atechsys to develop an electric delivery drone capable of autonomously transporting a parcel up to dimensions of 40 x 30 x 20 cm (16 x 12 x 8 in) and 4 kg (9 lb) in weight within a 20 km (12 mile) radius. The project is looking at the use of drones to access isolated areas such as mountains, islands and rural areas, as well as providing a means of responding to emergency situations.

Demonstrating the possible use of drones in real world conditions, the test involved automated take-off, flight phase, landing and return to base. Unfortunately, GeoPost hasn't released any specs on the prototype itself but we can tell you that the 3.7 kg (8.2 lb) six-rotor prototype is reported to have successfully transported a 2 kg (4.4 lb) package over a distance of 1,200 m (about 4,000 ft) at the CEEMA site in the south of France.

Be-Bound®'s curator insight, January 5, 3:38 AM

Amazon tried it last year, now the French Mail service and soon many more will follow. The technology and the logistics are mastered, without the shadow of a doubt, however, now the big challenge will be traffic regulation and authorization.

Scooped by Dr. Stefan Gruenwald!

A Worm's Mind In A Lego Body: Scientists Map Brain Connectome of C.elegans and Upload it to a Lego Robot

Take the connectome of a worm and transplant it as software into a Lego Mindstorms EV3 robot - what happens next? It is a deep and long-standing philosophical question: are we just the sum of our neural networks? Of course, if you work in AI you take the answer mostly for granted, but until someone builds a human brain and switches it on, we really don't have a concrete example of the principle in action.

The nematode worm Caenorhabditis elegans (C. elegans) is tiny and only has 302 neurons. These have been completely mapped and the OpenWorm project is working to build a complete simulation of the worm in software. One of the founders of the OpenWorm project, Timothy Busbice, has taken the connectome and implemented an object oriented neuron program.

The model is accurate in its connections and uses UDP packets to fire neurons. If two neurons have three synaptic connections, then when the first neuron fires, a UDP packet with the payload "3" is sent to the second neuron. The neurons are addressed by IP address and port number. The system uses an integrate-and-fire algorithm: each neuron sums the incoming weights and fires if the sum exceeds a threshold. The accumulator is zeroed if no message arrives within a 200 ms window, or when the neuron fires. This is similar to what happens in the real neural network, but not exact.
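The scheme described above can be sketched as follows. The threshold value is illustrative, since the article does not give the model's actual parameters, and real UDP sockets are replaced here by direct method calls:

```python
class Neuron:
    """Integrate-and-fire unit: sums incoming synaptic weights and fires
    when the accumulator crosses a threshold. The accumulator is zeroed
    on firing, or when no message has arrived within a 200 ms window."""

    def __init__(self, threshold=5, window=0.2):
        self.threshold = threshold
        self.window = window  # 200 ms accumulation window, in seconds
        self.acc = 0
        self.last_msg = None

    def receive(self, weight, now):
        # 'weight' plays the role of the UDP payload, e.g. "3" for
        # three synaptic connections from the sending neuron
        if self.last_msg is not None and now - self.last_msg > self.window:
            self.acc = 0              # window expired: reset accumulator
        self.last_msg = now
        self.acc += weight
        if self.acc >= self.threshold:
            self.acc = 0              # zeroed on firing
            return True               # would emit UDP packets downstream
        return False

n = Neuron()
n.receive(3, now=0.00)  # accumulator at 3: below threshold
n.receive(3, now=0.05)  # accumulator reaches 6: fires
```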

The software works with sensors and effectors provided by a simple LEGO robot. The sensors are sampled every 100ms. For example, the sonar sensor on the robot is wired as the worm's nose. If anything comes within 20cm of the "nose" then UDP packets are sent to the sensory neurons in the network.

The same idea is applied to the 95 motor neurons but these are mapped from the two rows of muscles on the left and right to the left and right motors on the robot. The motor signals are accumulated and applied to control the speed of each motor.  The motor neurons can be excitatory or inhibitory and positive and negative weights are used. 

And the result? It is claimed that the robot behaved in ways that are similar to observed C. elegans. Stimulation of the nose stopped forward motion. Touching the anterior and posterior touch sensors made the robot move forward and back accordingly. Stimulating the food sensor made the robot move forward.

More Information: The Robotic Worm (Biocoder pdf - free on registration)
No comment yet.
Scooped by Dr. Stefan Gruenwald!

Autonomous, human-sized security robots are almost here

As the sun set on a warm November afternoon, a quartet of five-foot-tall, 300-pound shiny white robots patrolled in front of Building 1 on Microsoft’s Silicon Valley campus. Looking like a crew of slick Daleks imbued with the grace of Fred Astaire, they whirred quietly across the concrete in different directions, stopping and turning in place so as to avoid running into trash cans, walls, and other obstacles.

The robots managed to appear both cute and intimidating. This friendly-but-not-too-friendly presence is meant to serve them well in jobs like monitoring corporate and college campuses, shopping malls, and schools.

Knightscope, a startup based in Mountain View, California, has been busy designing, building, and testing the robot, known as the K5, since 2013. Seven have been built so far, and the company plans to deploy four before the end of the year at an as-yet-unnamed technology company in the area. The robots are designed to detect anomalous behavior, such as someone walking through a building at night, and report back to a remote security center.

“This takes away the monotonous and sometimes dangerous work, and leaves the strategic work to law enforcement or private security, depending on the application,” Knightscope cofounder and vice president of sales and marketing Stacy Stephens said as a K5 glided nearby.

In order to do the kind of work a human security guard would normally do, the K5 uses cameras, sensors, navigation equipment, and electric motors—all packed into its dome-shaped body with a big rechargeable battery and a computer. There are four high-definition cameras (one on each side of the robot), a license-plate recognition camera, four microphones, and a weather sensor (which looks like a DVD-player slot) for measuring barometric pressure, carbon dioxide levels, and temperature. The robots use Wi-Fi or a wireless data network to communicate with each other and with people who can remotely monitor their cameras, microphones, and other sources of data.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Prototyping a biological drone, made from bacteria and fungi that melts away after use

What's stealthier than an ordinary drone? One that can disintegrate when it needs to, in order to destroy evidence of its spying activities. A team of researchers from various educational institutions and NASA Ames Research Center has developed a biodegradable drone made of mycelium (the vegetative part of fungi), which recently completed its first flight. According to Lynn Rothschild of NASA Ames, once the drone, say, self-destructs by diving into a puddle, "No one would know if you'd spilled some sugar water or if there'd been an airplane there."

A New York company called Ecovative Design grew mycelia into a custom drone-shaped chassis you see above. Unfortunately, some parts of the drone just can't be replaced with biodegradable materials for now, though the team tried to stay true to the idea and used silver nanoparticle ink (which can disintegrate along with the chassis) to print the device's circuits. For the test flight earlier this month, the team had to use propellers, controls and batteries taken from an ordinary quadcopter, but that might change in the future. You can read all about the development process on the scientists' website, where you can also download some 3D printable files of a few drone chassis concepts.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Robotic underwater gliders reveal why the Antarctic ice sheet is melting 150 billion tons per year

At current rates, ice sheet loss will become the most significant contributor to global sea level rise during this century, yet there is still a lot that scientists don't know about the underlying causes. This is partly because Antarctica is such a difficult place to take measurements.

But now robotic underwater gliders are giving scientists new insight into why the Antarctic ice sheet is melting. An ice sheet is a huge layer of ice that sits on land. The two on the Earth today are found on Antarctica and Greenland, but in the last ice age there were also ice sheets on North America and northern Europe.

The Antarctic ice sheet spans more than 14 million square kilometers, which is roughly the same size as the US and Mexico put together. The ice sheet also spills out onto the surrounding ocean in the form of ice shelves.

The Intergovernmental Panel on Climate Change (IPCC) estimates that the Antarctic ice sheet is currently losing around 150 billion tonnes of ice per year. One of the main areas of ice loss is the Antarctic Peninsula, shown in the red rectangle in the map below.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

A brain-inspired chip lets drones learn during flight

There isn’t much space between your ears, but what’s in there can do many things that a computer of the same size never could. Your brain is also vastly more energy efficient at interpreting the world visually or understanding speech than any computer system.

That’s why academic and corporate labs have been experimenting with “neuromorphic” chips modeled on features seen in brains. These chips have networks of “neurons” that communicate in spikes of electricity (see “Thinking in Silicon”). They can be significantly more energy-efficient than conventional chips, and some can even automatically reprogram themselves to learn new skills.

Now a neuromorphic chip has been untethered from the lab bench, and tested in a tiny drone aircraft that weighs less than 100 grams. In the experiment, the prototype chip, with 576 silicon neurons, took in data from the aircraft’s optical, ultrasound, and infrared sensors as it flew between three different rooms.

The first time the drone was flown into each room, the unique pattern of incoming sensor data from the walls, furniture, and other objects caused a pattern of electrical activity in the neurons that the chip had never experienced before. That triggered it to report that it was in a new space, and also caused the ways its neurons connected to one another to change, in a crude mimic of learning in a real brain. Those changes meant that next time the craft entered the same room, it recognized it and signaled as such.
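The room-recognition behavior can be caricatured in a few lines. This toy cosine-similarity novelty detector stands in for the chip's actual plasticity rule, which the article does not specify, and the threshold is an assumption:

```python
import math

def cosine(a, b):
    """Cosine similarity between two activity patterns."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify(pattern, memory, threshold=0.9):
    """Return the index of a known room if the sensor-driven activity
    pattern resembles a stored one; otherwise store it ('learn') and
    report a new room."""
    for i, stored in enumerate(memory):
        if cosine(pattern, stored) >= threshold:
            return i          # recognized: signal a familiar space
    memory.append(pattern)    # novel pattern: connections change
    return len(memory) - 1

rooms = []
classify([1.0, 0.1, 0.0], rooms)  # first room seen: index 0
classify([0.0, 1.0, 0.2], rooms)  # novel pattern: index 1
classify([0.9, 0.1, 0.0], rooms)  # recognized as room 0 again
```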

The chip involved is far from ready for practical deployment, but the test offers empirical support for the ideas that have motivated research into neuromorphic chips, says Narayan Srinivasa, who leads HRL’s Center for Neural and Emergent Systems. “This shows it is possible to do learning literally on the fly, while under very strict size, weight, and power constraints,” he says.

No comment yet.
Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

Cloud Robotics: The Plan to Build a Massive Online Brain for All the World’s Robots

If you walk into the computer science building at Stanford University, Mobi is standing in the lobby, encased in glass. He looks a bit like a garbage can, with a rod for a neck and a camera for eyes. He was one of several robots developed at Stanford in the 1980s to study how machines might learn to navigate their environment—a stepping stone toward intelligent robots that could live and work alongside humans. He worked, but not especially well. The best he could do was follow a path along a wall. Like so many other robots, his “brain” was on the small side.

Now, just down the hall from Mobi, scientists led by roboticist Ashutosh Saxena are taking this mission several steps further. They’re working to build machines that can see, hear, comprehend natural language (both written and spoken), and develop an understanding of the world around them, in much the same way that people do.

Today, backed by funding from the National Science Foundation, the Office of Naval Research, Google, Microsoft, and Qualcomm, Saxena and his team unveiled what they call RoboBrain, a kind of online service packed with information and artificial intelligence software that any robot could tap into. Working alongside researchers at the University of California at Berkeley, Brown University, and Cornell University, they hope to create a massive online “brain” that can help all robots navigate and even understand the world around them. “The purpose,” says Saxena, who dreamed it all up, “is to build a very good knowledge graph—or a knowledge base—for robots to use.”

Any researcher anywhere will be able to use the service wirelessly, for free, and transplant its knowledge to local robots. These robots, in turn, will feed what they learn back into the service, improving RoboBrain’s know-how. Then the cycle repeats.

These days, if you want a robot to serve coffee or carry packages across a room, you have to hand-code a new software program—or ask a fellow roboticist to share code that’s already been built. If you want to teach a robot a new task, you start all over. These programs, or apps, live on the robot itself, and that, Saxena says, is inefficient. It goes against all the current trends in tech and artificial intelligence, which seek to exploit the power of distributed systems, massive clusters of computers that can power devices over the net. But this is starting to change. RoboBrain is part of an emerging movement known as cloud robotics.
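At its simplest, the shared "knowledge graph" idea amounts to queryable facts that any robot can read and extend. The triples and relation names below are invented for illustration and are not RoboBrain's actual schema or API:

```python
# subject -> relation -> objects (hypothetical robot knowledge)
graph = {
    "mug": {"has_affordance": ["grasp", "pour"],
            "found_near": ["coffee_machine"]},
    "keyboard": {"has_affordance": ["type_on"],
                 "part_of": ["workstation"]},
}

def query(subject, relation):
    """What any connected robot would ask the shared service."""
    return graph.get(subject, {}).get(relation, [])

def learn(subject, relation, obj):
    """What a robot feeds back after discovering something new,
    improving the shared knowledge base for every other robot."""
    graph.setdefault(subject, {}).setdefault(relation, []).append(obj)

query("mug", "has_affordance")      # how do I handle a mug?
learn("mug", "found_near", "sink")  # local learning updates the cloud
```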

Via Mariaschnee
Tekrighter's curator insight, August 28, 2014 10:01 AM

One of the most perplexing problems in science today is efficient integration of disparate data repositories. This is a step in the right direction.

Scooped by Dr. Stefan Gruenwald!

A Better Hand: Multitasking Like Never Before With These Robotic Fingers

Many hands make light work, right? Well, MIT researchers have created a wrist-worn robot with a couple of extra digits.

There are several explanations for why the human hand developed the way it has. Some researchers link our opposable thumbs to our ancestors’ need to club and hurl objects at enemies or throw a punch, while others say that a unique gene enhancer (a stretch of DNA that boosts the activity of certain genes) is what led to our anatomy. But most agree that bipedalism, enlarged brains and the need to use tools are what did the trick.

Yet, for as dexterous as our hands make us, a team of researchers at the Massachusetts Institute of Technology thinks we can do better. Harry Asada, a professor of engineering, has developed a wrist-worn robot that will allow a person to peel a banana or open a bottle one-handed.

Together with graduate student Faye Wu, Asada built a pair of robotic fingers that track, mimic and assist a person’s own five digits. The two extra appendages, which look like elongated plastic pointer fingers, attach to a wrist cuff and extend alongside the thumb and pinkie. The apparatus connects to a sensor-laden glove, which measures how a person’s fingers bend and move. An algorithm crunches that movement data and translates it into actions for each robotic finger.

The robot takes a lesson from the way our own five digits move. One control signal from the brain activates groups of muscles in the hand. This synergy, Wu explains in a video demonstration, is much more efficient than sending signals to individual muscles.

In order to map how the extra fingers would move, Wu attached the device to her wrist and began grabbing objects throughout the lab. With each test, she manually positioned the robot fingers onto an object in a way that would be most helpful—for example, steadying a soda bottle while she used her hand to untwist the top. In each instance, she recorded the angles of both her own fingers and those of her robot counterpart.
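One plausible way to turn those paired recordings into a control law is an ordinary least-squares fit from human finger angles to robot finger angles. The numbers below are made up, and the MIT team's actual algorithm is not detailed in the article:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept mapping a human joint angle
    to a robot finger angle, from recorded demonstration pairs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# recorded pairs: (human index-finger angle, helpful robot-finger angle)
human = [10.0, 30.0, 50.0]
robot = [8.0, 18.0, 28.0]
slope, intercept = fit_line(human, robot)

def robot_angle(human_angle):
    """Drive the extra finger from the glove's live measurement."""
    return slope * human_angle + intercept
```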

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Origami Inspires the Rise of Self-Folding Robot

A creation made of composite paper can fold and assemble itself and start working without intervention. Such robots could be deployed cheaply and quickly.

An intricately cut sheet lies flat and motionless on a table. Then Samuel Felton, a graduate student at Harvard, connects the batteries, sending electricity coursing through, heating it. The sheet lurches to life, the pieces bending and folding into place. The transformation completes in four minutes, and the sheet, now a four-limbed robot, scurries away at more than two inches a second. The creation, reported Thursday in the journal Science, is the first robot that can fold itself and start working without any intervention from the operator. “We’re trying to make robots as quickly and cheaply as possible,” Mr. Felton said.

Inspired by origami, the Japanese paper-folding art, such robots could be deployed, for example, on future space missions, Mr. Felton said. Or perhaps the technology could one day be applied to Ikea-like furniture, folding from a flat-packed board to, say, a table without anyone fumbling with Allen wrenches or deciphering instructions seemingly rendered in hieroglyphics.

Mr. Felton’s sheet is not simple paper, but a composite made of layers of paper, a flexible circuit board and Shrinky Dinks — plastic sheets, sold as a toy, that shrink when heated above 212 degrees Fahrenheit. The researchers attached to the sheet two motors, two batteries and a microcontroller that served as the brain for the robot. Those components accounted for $80 of the $100 of materials needed for the robot. While the robot could fold itself, the sheet took a couple of hours for Mr. Felton to construct. Still, it was simpler and cheaper than the manufacturing process for most machines today — robots, iPhones, cars — which are made of many separate pieces that are then glued, bolted and snapped together.

Mr. Felton’s adviser, Robert J. Wood, a professor of engineering and applied sciences, was initially interested in building insect-size robots. But for machines that small, “there really are no manufacturing processes that are applicable,” Dr. Wood said.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

NASA's next Mars rover will make oxygen, to sustain life

For 17 years, NASA rovers have laid down tire tracks on Mars. But details the space agency divulged this week about its next Martian exploration vehicle underscored NASA's ultimate goal. Footprints are to follow someday.

The last three rovers -- Spirit, Opportunity and Curiosity -- confirmed that the Red Planet was once able to support life and searched for signs of past life. The Mars rover of the next decade will home in on ways to sustain future life there, human life.

"The 2020 rover will help answer questions about the Martian environment that astronauts will face and test technologies they need before landing on, exploring and returning from the Red Planet," said NASA's William Gerstenmaier, who works on human missions. This will include experiments that convert carbon dioxide in the Martian atmosphere into oxygen "for human respiration." Oxygen could also be used on Mars in making rocket fuel that would allow astronauts to refill their tanks.
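Chemically, the conversion splits carbon dioxide into carbon monoxide and oxygen: 2 CO2 → 2 CO + O2. A back-of-the-envelope yield calculation, assuming idealized 100% conversion that real hardware will not reach:

```python
M_CO2, M_O2 = 44.01, 32.00  # molar masses, g/mol

def o2_from_co2(co2_grams):
    """Oxygen mass produced from a given mass of Martian CO2,
    assuming every molecule is split: 2 CO2 -> 2 CO + O2."""
    moles_co2 = co2_grams / M_CO2
    return (moles_co2 / 2) * M_O2   # one O2 per two CO2

o2_from_co2(1000.0)  # roughly 364 g of O2 per kg of CO2
```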

The 2020 rover is the near spitting image of Curiosity and NASA's Jet Propulsion Laboratory announced plans to launch the new edition not long after Curiosity landed on Mars in 2012. But the 2020 rover has new and improved features. The Mars Oxygen ISRU Experiment, or MOXIE for short, is just one. There are super cameras that will send back 3D panoramic images and spectrometers that will analyze the chemical makeup of minerals with an apparent eye to farming.

"An ability to live off the Martian land would transform future exploration of the planet," NASA said in a statement. The 2020 rover will also create a job for a future mission to complete, once the technology emerges to return to Earth from Mars. It will collect soil samples to be sent back for lab analysis at NASA.

Eric Chan Wei Chiang's curator insight, August 2, 2014 10:57 PM

Oxygen production and minerals for farming would pave the way for manned missions and perhaps even a small colony. The next step would be a Mars sample return mission.


Scooped by Dr. Stefan Gruenwald!

This New Four-Legged Robot Is Basically Invincible

Boston Dynamics, the company that builds incredibly agile robots, has added another four-legged sprinter to its pack. In order to introduce the world to “Spot,” the crew at Boston Dynamics kicked the innocent robot as it walked through the halls of their building — and filmed it. However, as you can see in the YouTube video, Spot never falters under the abuse; it dynamically corrects its balance even after a good shove.

When you’re an advanced robotics builder owned by Google, you don’t have to do much to make a splash. Boston Dynamics’ video (clearly filmed before Snowmageddon) is simply called “Introducing Spot,” and it’s two minutes of the quadruped climbing stairs, walking up hills, and, of course, getting kicked. A four-sentence video description is the only additional information the company is providing about Spot:

  • Spot is a four-legged robot designed for indoor and outdoor operation. It is electrically powered and hydraulically actuated.
  • Spot has a sensor head that helps it navigate and negotiate rough terrain. Spot weighs about 160 lbs.
No comment yet.
Scooped by Dr. Stefan Gruenwald!

‘Cobots’ - robots that work side-by-side with humans - enhance robotic manufacturing and throughput

Manufacturers have begun experimenting with a new generation of “cobots” (collaborative robots) designed to work side-by-side with humans. To determine best practices for effectively integrating human-robot teams within manufacturing environments, a University of Wisconsin-Madison team headed by Bilge Mutlu, an assistant professor of computer sciences, is working with an MIT team headed by Julie A. Shah, an assistant professor of aeronautics and astronautics.

Their research is funded by a three-year grant from the National Science Foundation (NSF) as part of its National Robotics Initiative program.

Cobots are less expensive and intended to be easier to reprogram and integrate into manufacturing. For example, Steelcase owns four next-generation robots based on a platform called Baxter, made by Rethink Robotics.

Each Baxter robot has two arms and a tablet-like panel for “eyes” that provide cues to help human workers anticipate what the robot will do next.

“This new family of robotic technology will change how manufacturing is done,” says Mutlu. “New research can ease the transition of these robots into manufacturing by making human-robot collaboration better and more natural as they work together.”

Mutlu’s team is building on previous work related to topics such as gaze aversion in humanoid robots, robot gestures, and the issue of “speech and repair.” For example, if a human misunderstands a robot’s instructions or carries them out incorrectly, how should the robot correct the human?

On Rethink Robotics’ blog, founder and chairman Rodney Brooks notes “three exciting and significant trends taking place right now” that he thinks will begin to gain some very real traction in 2015:

  • We will begin to see large-scale deployment of collaborative and intelligent robots in manufacturing.
  • This will be a breakout year for robotics research.
  • Emerging technology will be designed to solve some of the world’s biggest problems.

Scooped by Dr. Stefan Gruenwald!

The Chinese hotel that's staffed ENTIRELY by robots - all your needs are met by cyborgs at $11 per night

The Chinese hotel that's staffed ENTIRELY by robots - all your needs are met by cyborgs at $11 per night | Amazing Science |

A brand new hotel has skipped the recruitment process, because all its staff are robots. From reception desk staff to security doormen and waiters, the Pengheng Space Capsules Hotel in Shenzhen, China, has built, rather than hired, its new employees. Start-up costs and robot maintenance aside, staff bills must be minimal. And it seems the hotel is keen to pass the savings right back to the customer: a night's stay costs just £6.80.

You can even order food and drinks from the lounge area using supplied tablet computers, and your choices arrive via robot waiter just a few minutes later. Best of all, however, is the price. A stay in the hotel costs a mere 70 yuan (£6.81) per person. For that price you might not expect much, but the hotel's facilities are impressive. This capsule hotel, a format pioneered in Japan featuring many extremely small rooms dubbed capsules, combines affordable, minimalist overnight accommodation with a futuristic vibe.

An eerie blue-lit corridor leads to the space station-style bunks, which resemble something lifted out of a sci-fi film. The hotel also features banks of computers, lockers, washrooms and a DIY laundry room.

With robots, neon and shiny surfaces as far as the eye can see, a stay in the hotel has become a must for both local and visiting tech fans.

india cox's curator insight, May 5, 11:08 PM

This is an incredible way to revolutionize the hotel industry. Completely removing part of the HR process changes the way that hotels operate: rather than going through the long employment process, this hotel builds its employees. It is a cost-effective way to operate a hotel, but operating this way doesn't come without its negative aspects. As a guest you generally stay in a hotel to go on a holiday; this hotel is good for a short stay, as it's more about the capsule-hotel experience, but it isn't very appropriate for a long one. The only thing that worries me is how heavily the hotel relies on technology as part of its service. Who do you call if you have a problem or the tablet breaks? You can't really get the robot to fix it for you. The same goes for the amenities: the DIY laundry service isn't really relevant. When you go to a hotel you want to be on a holiday, not cleaning your own laundry. If they rely so heavily on technology, why can't you get the robots to do your laundry?

I don't think this is a long-term solution, and the negatives outweigh the benefits. The lack of service doesn't reflect positively on the hospitality industry; it seems cold rather than hospitable. I really don't think this is a good HR solution, rather an unsustainable shortcut.

In theory, it's great. In practice, not so great.

Scooped by Dr. Stefan Gruenwald!

New frontiers: Drones deliver a raft of surprises in 2014

New frontiers: Drones deliver a raft of surprises in 2014 | Amazing Science |
2014 wasn’t the year that drones first entered the consumer lexicon, but it did see the notion of using these unmanned vehicles to our advantage become much more palatable. Package deliveries and conventional robotic tasks are among the concepts that have defined the progress of drones over the past 12 months, but, as is typical of emerging technologies, the more their potential is realized, the more they find uses in unexpected new applications. Let’s look at some of the year’s more surprising, yet significant, drone projects that promise to shake things up in exciting new ways.
Scooped by Dr. Stefan Gruenwald!

Researchers print out self-learning robots

Researchers print out self-learning robots | Amazing Science |

When the robots of the future are set to extract minerals from other planets, they need to be both self-learning and self-repairing. Researchers at Oslo University have already succeeded in producing self-instructing robots on 3D printers.

“In the future, robots must be able to solve tasks in deep mines on distant planets, in radioactive disaster areas, in hazardous landslip areas and on the sea bed beneath the Antarctic. These environments are so extreme that no human being can cope. Everything needs to be automatically controlled. Imagine that the robot is entering the wreckage of a nuclear power plant. It finds a staircase that no-one has thought of. The robot takes a picture. The picture is analyzed. One of the robot’s arms is fitted with a printer. This produces a new robot, or a new part for the existing robot, which enables it to negotiate the stairs,” hopes Associate Professor Kyrre Glette, who is part of the Robotics and intelligent systems research team at Oslo University’s Department of Informatics.

Even if Glette’s ideas remain visions of the future, the robotics team in the Informatics Building have already developed three generations of self-learning robots.

Professor Mats Høvin was the man behind the first model, the chicken-robot named “Henriette”, which received much media coverage when it was launched ten years ago. Henriette had to teach itself how to walk, and to jump over obstacles. And if it lost a leg, it had to learn, unaided, how to hop on the other leg.

A few years later, Masters student Tønnes Nygaard launched the second generation robot. At the same time, the Informatics team developed a simulation program that was able to calculate what the body should look like. Just as for Henriette, its number of legs was pre-determined, but the computer program was at liberty to design the length of the legs and the distance between them.

The third generation of robots brings even greater flexibility. The simulation program takes care of the complete design and suggests the optimal number of legs and joints.
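The simulate-then-print pipeline described here (generate candidate bodies, keep the best design, then fabricate and re-test) can be sketched as a simple search loop. This is an illustrative sketch only; the fitness function below is a toy stand-in, not the Oslo team's actual simulator.

```python
import random

def fitness(leg_length):
    # Toy stand-in for the simulator: assume walking performance
    # peaks at some intermediate leg length (here, 7.0 units).
    return -(leg_length - 7.0) ** 2

def evolve(generations=200, seed=42):
    """Hill-climb over a single design parameter (leg length)."""
    rng = random.Random(seed)
    best = rng.uniform(1.0, 15.0)       # random initial design
    for _ in range(generations):
        candidate = best + rng.gauss(0, 0.5)  # mutate the current design
        if fitness(candidate) > fitness(best):
            best = candidate             # keep improvements only
    return best

print(evolve())  # converges toward the optimum near 7.0
```

In the real pipeline the fitness evaluation is a physics simulation and the design space includes the number of legs and joints, but the select-mutate-retest structure is the same.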

Simulation is not enough. In order to test the functionality of the robots, they need to undergo trials in the real world. The robots are produced as printouts from a 3D printer. “Once the robots have been printed, their real-world functionalities quite often prove to be different from those of the simulated versions. We are talking of a reality gap. There will always be differences. Perhaps the floor is more slippery in reality, meaning that the friction coefficient will have to be changed. We are therefore studying how the robots’ performance degrades from the simulation to the laboratory stage,” says Mats Høvin.

Scooped by Dr. Stefan Gruenwald!

Wearable Competition Finalist: Wear a Spy Drone on Your Wrist

Wearable Competition Finalist: Wear a Spy Drone on Your Wrist | Amazing Science |

A drone that can be dispatched with the flick of a wrist feels like an invention likely to fly out from the Batcave, but a Stanford Ph.D. and a Google program manager are close to finalizing a quadcopter that can be worn like a slap bracelet.

Called Nixie, this diminutive drone weighs less than a tenth of a pound, but can capture HD images and sync with a smartphone while its owner is busy scaling an Alp or biking through the Teutoburg forest. “Quadcopters give you a new perspective you can’t get anywhere else,” says Jelena Jovanovic, Nixie’s project manager. “But it’s not really feasible to pilot a drone and keep doing what you’re doing.”

Being able to wear the drone is a cute gimmick, but the powerful software packed into its tiny shell could set Nixie apart from bargain Brookstone quadcopters. Expertise in motion-prediction algorithms and sensor fusion will give the wrist-worn whirlybirds an impressive range of functionality. A “Boomerang mode” allows Nixie to travel a fixed distance from its owner, take a photo, then return. “Panorama mode” takes aerial photos in a 360° arc. “Follow me” mode makes Nixie trail its owner and would capture amateur athletes in a perspective typically reserved for Madden all-stars. “Hover mode” gives any filmmaker easy access to impromptu jib shots. Other drones promise similar functionality, but none promise the same level of portability or user friendliness. “We’re not trying to build a quadcopter, we’re trying to build a personal photographer,” says Jovanovic.
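The modes read like parameterized flight plans relative to the wearer. Below is a rough sketch of how such a mode dispatcher could be structured; the function name, waypoint math, and defaults are illustrative assumptions, not Nixie's actual software.

```python
import math

def plan(mode, owner_pos=(0.0, 0.0), distance=5.0, n=8):
    """Return a list of (x, y) waypoints for one flight-mode pass."""
    ox, oy = owner_pos
    if mode == "boomerang":
        # fly out a fixed distance, take a photo, return to the owner
        return [(ox + distance, oy), owner_pos]
    if mode == "panorama":
        # circle the owner, shooting along a 360-degree arc
        return [(ox + distance * math.cos(2 * math.pi * k / n),
                 oy + distance * math.sin(2 * math.pi * k / n))
                for k in range(n)]
    if mode == "hover":
        # hold position for an impromptu jib shot
        return [owner_pos]
    raise ValueError(f"unknown mode: {mode}")

print(len(plan("panorama")))  # 8 waypoints around the owner
```

"Follow me" mode is omitted because it depends on a live position stream rather than a precomputed path.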

Jovanovic and her partner Christoph Kohstall, a Stanford postdoc who holds a Ph.D. in quantum physics and a first-author credit in the journal Nature, believe photography is at a tipping point. Early cameras were bulky, expensive, and difficult to operate. The last hundred years have produced consistently smaller, cheaper, and easier-to-use cameras, but future developments are forking. Google Glass provides the ultimate in portability, but leaves wearers with a fixed perspective. Surveillance drones offer unique vantage points, but are difficult to operate. Nixie attempts to offer the best of both worlds. 

Nixie is an undeniably impressive concept, and while rough prototypes prove the principle, the question remains whether its myriad design challenges can be solved without sacrificing the sleek look.

The team’s strong background suggests they can. As a teenager, Kohstall used bike frame parts and Lego motors to design a telescope that could follow a point in the sky for long-exposure star photographs, before graduating to writing a treatise on Metastability and Coherence of Repulsive Polarons in a Strongly Interacting Fermi Mixture.

Scooped by Dr. Stefan Gruenwald!

Microbot muscles: Chains of particles assemble and flex | (e) Science News

Microbot muscles: Chains of particles assemble and flex | (e) Science News | Amazing Science |

In a step toward robots smaller than a grain of sand, University of Michigan researchers have shown how chains of self-assembling particles could serve as electrically activated muscles in the tiny machines. So-called microbots would be handy in many areas, particularly medicine and manufacturing. But several challenges lie between current technologies and science fiction possibilities. Two of the big ones are building the 'bots and making them mobile.

"We are inspired by ideas of microscopic robots," said Michael Solomon, a professor of chemical engineering. "They could work together and go places that have never been possible before." Solomon and his group demonstrated that some gold plating and an alternating electric field can help oblong particles form chains that extend by roughly 36 percent when the electric field is on.

"What's really important in the field of nanotechnology right now is not just assembling into structures, but assembling into structures that can change or shape-shift," said Sharon Glotzer, the Stuart W. Churchill Professor of Chemical Engineering, whose team developed computer simulations that helped explain how the chains grew and operated.

The innovation that led to the shape-shifting, said Aayush Shah, a doctoral student in Solomon's group, is the addition of the electric field to control the behavior of the particles. "The particles are like children in a playground," Shah said. "They do interesting things on their own, but it takes a headmaster to make them do interesting things together."

The team started with particles similar to those found in paint, with diameters of about a hundredth the width of a strand of hair. They stretched these particles into football shapes and coated one side of each football with gold. The gilded halves attracted one another in slightly salty water, ideally about half the salt concentration in the sports drink Powerade. The more salt in the water, the stronger the attraction.

Left to their own devices, the particles formed short chains of overlapping pairs, averaging around 50 or 60 particles to a chain. When exposed to an alternating electric field, the chains seemed to add new particles indefinitely. But the real excitement was in the way that the chains stretched.

"We want them to work like little muscles," Glotzer said. "You could imagine many of these fibers lining up with the field and producing locomotion by expanding and contracting." While the force generated by the fibers is about 1,000 times weaker than human muscle tissue per unit area, it may be enough for microbots.

Scooped by Dr. Stefan Gruenwald!

Robot that moves like a caterpillar could go places other robots can't

Robot that moves like a caterpillar could go places other robots can't | Amazing Science |

The peculiar way that an inchworm inches along a surface may not be fast compared to using legs, wings, or wheels, but it does have advantages when it comes to maneuvering in small spaces. This is one of the reasons why researchers have designed and built a soft, worm-like robot that moves with a typical inchworm gait, pulling its body up and extending it forward to navigate its environment. The robots could one day be used in rescue and reconnaissance missions in places that are inaccessible to humans or larger robots.

The researchers, Wei Wang, et al., at Seoul National University in South Korea, have published their paper on the inchworm-inspired robot in a recent issue of Bioinspiration & Biomimetics.

In nature, the inchworm is the larval stage of the geometer moth and measures about an inch or two long. The small green worm has two or three legs near its front, and two or three foot-like structures called "prolegs" at its rear end. Although they don't have bones, inchworms have complex muscle systems that allow them to perform a variety of body movements, including standing up vertically on their back prolegs.

To mimic the inchworm, the researchers used the soft, highly flexible silicone material PDMS for the robot's body. The researchers built an inchworm mold using a 3D printer, and then poured PDMS solution into the mold. Then they glued small pieces of polyimide film to make feet at the front and rear ends. To play the role of muscle fibers, the researchers used eight longitudinal shape memory alloy (SMA) wires that extend throughout the inchworm robot's body.

By actuating the SMA wires with electric currents, the researchers could cause the inchworm robot's body to move with a natural inchworm gait. Actuating the SMA wires symmetrically causes the robot's body to contract symmetrically, resulting in linear motion. Asymmetrical actuation results in asymmetric deformation and a turning locomotion using one foot as an anchor. In the inchworm gait, the feet must continually change from being used as anchors to sliding in order to generate the push-pull motion. The researchers used alternating low-friction and high-friction foot segments to replicate these foot changes.
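The actuation logic above, symmetric contraction for straight-line motion and asymmetric contraction for turning, can be sketched as follows. The split into left and right wire groups and the duty-cycle values are illustrative assumptions, not the authors' actual controller.

```python
def gait_step(direction="forward"):
    """Return per-wire duty cycles for one actuation cycle.

    Eight longitudinal SMA wires run through the robot's body, as in
    the paper; the left/right grouping and numbers are illustrative.
    """
    n_wires = 8
    if direction == "forward":
        # symmetric actuation -> symmetric contraction -> linear motion
        return [0.5] * n_wires
    if direction == "left":
        # asymmetric actuation: one side anchors while the other pulls
        return [0.2] * (n_wires // 2) + [0.8] * (n_wires // 2)
    if direction == "right":
        return [0.8] * (n_wires // 2) + [0.2] * (n_wires // 2)
    raise ValueError(f"unknown direction: {direction}")

print(gait_step("forward"))  # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

The alternating anchor/slide behavior of the feet would sit in a separate friction-control layer, synchronized with these contraction cycles.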

Scooped by Dr. Stefan Gruenwald!

Making drones more customizable: First-ever standard “operating system” for drones

Making drones more customizable: First-ever standard “operating system” for drones | Amazing Science |

A first-ever standard “operating system” for drones, developed by a startup with MIT roots, could soon help manufacturers easily design and customize unmanned aerial vehicles (UAVs) for multiple applications.

Today, hundreds of companies worldwide are making drones for infrastructure inspection, crop- and livestock-monitoring, and search-and-rescue missions, among other things. But these are built for a single mission, so modifying them for other uses means going back to the drawing board, which can be very expensive.

Now Airware, founded by MIT alumnus Jonathan Downey ’06, has developed a platform — hardware, software, and cloud services — that lets manufacturers pick and choose various components and application-specific software to add to commercial drones for multiple purposes.

The key component is the startup’s Linux-based autopilot device, a small red box that is installed into all of a client’s drones. “This is responsible for flying the vehicle in a safe, reliable manner, and acts as a hub for the components, so it can collect all that data and display that info to a user,” says Downey, Airware’s CEO, who researched and built drones throughout his time at MIT.

To customize the drones, customers use software to select third-party drone vehicles and components — such as sensors, cameras, actuators, and communication devices — configure settings, and apply their configuration to a fleet. Other software helps them plan and monitor missions in real time (and make midflight adjustments), and collects and displays data. Airware then pushes all data to the cloud, where it’s aggregated and analyzed, and available to designated users.

If a company decides to use a surveillance drone for crop management, for instance, it can easily add software that stitches together different images to determine which areas of a field are overwatered or underwatered. “They don’t have to know the flight algorithms, or underlying hardware, they just need to connect their software or piece of hardware to the platform,” Downey says. “The entire industry can leverage that.”
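The pick-and-choose configuration Downey describes could be modeled as declarative data applied uniformly to a fleet. A minimal sketch with an invented layout; this is not Airware's actual file format or API.

```python
# Hypothetical fleet configuration: vehicle, components, and
# mission-specific settings chosen from a catalog of options.
fleet_config = {
    "vehicle": "third-party-quadcopter",
    "components": ["multispectral-camera", "gps", "radio-link"],
    "missions": {
        "crop-survey": {"altitude_m": 60, "overlap_pct": 70},
    },
}

def apply_to_fleet(config, drone_ids):
    """Pair every drone in the fleet with the shared configuration."""
    return {drone: config for drone in drone_ids}

deployed = apply_to_fleet(fleet_config, ["drone-01", "drone-02"])
print(sorted(deployed))  # ['drone-01', 'drone-02']
```

The point of such a scheme is the one Downey makes: a client swaps sensors or mission software by editing data, not flight algorithms.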

Clients have trialed Airware’s platform over the past year — including researchers at MIT, who are demonstrating delivery of vaccines in Africa. Delta Drone in France is using the platform for open-air mining operations, search-and-rescue missions, and agricultural applications. Another UAV maker, Cyber Technology in Australia, is using the platform for drones responding to car crashes and other disasters, and inspecting offshore oilrigs.

Now, with its most recent $25 million funding round, Airware plans to launch the platform for general adoption later this year, viewing companies that monitor crops and infrastructure — with drones that require specific cameras and sensors — as potential early customers.

Scooped by Dr. Stefan Gruenwald!

Terminator2: Phase-changing material could allow even low-cost robots to switch between hard and soft states

Terminator2: Phase-changing material could allow even low-cost robots to switch between hard and soft states | Amazing Science |

In the movie “Terminator 2,” the shape-shifting T-1000 robot morphs into a liquid state to squeeze through tight spaces or to repair itself when harmed.

Now a phase-changing material built from wax and foam, and capable of switching between hard and soft states, could allow even low-cost robots to perform the same feat.

The material — developed by Anette Hosoi, a professor of mechanical engineering and applied mathematics at MIT, and her former graduate student Nadia Cheng, alongside researchers at the Max Planck Institute for Dynamics and Self-Organization and Stony Brook University — could be used to build deformable surgical robots. The robots could move through the body to reach a particular point without damaging any of the organs or vessels along the way.

Robots built from the material, which is described in a new paper in the journal Macromolecular Materials and Engineering, could also be used in search-and-rescue operations to squeeze through rubble looking for survivors, Hosoi says.

Scooped by Dr. Stefan Gruenwald!

A self-organizing thousand-robot swarm can form any shape

A self-organizing thousand-robot swarm can form any shape | Amazing Science |

Following simple programmed rules, autonomous robots arrange themselves into vast, complex shapes.

“Form a sea star shape,” directs a computer scientist, sending the command to 1,024 little bots simultaneously via an infrared light. The robots begin to blink at one another and then gradually arrange themselves into a five-pointed star. “Now form the letter K.”

The ‘K’ stands for Kilobots, the name given to these extremely simple robots, each just a few centimeters across, standing on three pin-like legs. Instead of one highly complex robot, a “kilo” of robots collaborate, providing a simple platform for the enactment of complex behaviors.

Just as trillions of individual cells can assemble into an intelligent organism, or a thousand starlings can form a great flowing murmuration across the sky, the Kilobots demonstrate how complexity can arise from very simple behaviors performed en masse (see video). To computer scientists, they also represent a significant milestone in the development of collective artificial intelligence (AI).
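One ingredient of this kind of self-assembly is a hop-count "gradient" that each robot computes from its neighbors, letting it estimate how far it is from a few seed robots that anchor the shape. A minimal sketch, assuming idealized positions and a fixed communication radius rather than the Kilobots' actual infrared ranging:

```python
def compute_gradients(positions, seeds, radius=1.5):
    """Propagate hop-count gradient values outward from seed robots."""
    INF = float("inf")
    grad = {p: (0 if p in seeds else INF) for p in positions}
    changed = True
    while changed:                      # relax until a fixed point
        changed = False
        for p in positions:
            if p in seeds:
                continue
            neighbors = [q for q in positions if q != p and
                         (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= radius ** 2]
            best = min((grad[q] for q in neighbors), default=INF)
            if best + 1 < grad[p]:      # my gradient = 1 + min neighbor
                grad[p] = best + 1
                changed = True
    return grad

# A short line of robots next to one seed robot:
bots = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(compute_gradients(bots, seeds={(0, 0)}))
# {(0, 0): 0, (1, 0): 1, (2, 0): 2, (3, 0): 3}
```

Each robot only ever talks to its immediate neighbors, yet the swarm as a whole acquires a global coordinate cue, which is exactly the complexity-from-simple-rules theme described above.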

Scooped by Dr. Stefan Gruenwald!

Matter of Speed: How This Robot Wins Rock-Paper-Scissors Every Single Time Against Humans – It Cheats

Matter of Speed: How This Robot Wins Rock-Paper-Scissors Every Single Time Against Humans – It Cheats | Amazing Science |

You can't win rock-paper-scissors 100 percent of the time -- at least not if you're human. Even those well-versed in rock-paper-scissors strategy lose sometimes; that's how we get enough interest for rock-paper-scissors championships. But there's not an RPS savant in the world who can beat the newly unveiled Japanese "Janken" robot at the game.

Named after the Japanese version of rock-paper-scissors, Janken will win the game against a human every time it plays. How? The robot uses high-speed computer vision to see which symbol its human opponent's hand is forming and then, quicker than the eye can see, forms a winning symbol in response.

Think you're fast enough to beat the robot? Well, Janken can detect your movement in just one millisecond and make its own shape in just 20 milliseconds. For reference, it takes 40 milliseconds for the human eye to process a moving image; so we're pretty sure that no matter how good at rock-paper-scissors you are, you can't win.
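The robot's "cheat" reduces to two steps, recognize the human's hand shape and then play the counter, and the quoted timings show why it works: the whole response fits inside the window before a human can perceive motion. A minimal sketch, with illustrative names:

```python
# Each shape loses to exactly one other shape.
COUNTER = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

DETECT_MS = 1            # vision detects the human's hand shape
ACTUATE_MS = 20          # robot hand finishes forming its counter-shape
HUMAN_PERCEPTION_MS = 40  # time for the eye to process a moving image

def robot_reply(human_shape):
    """Return the winning counter-move, played before the human can see it."""
    latency = DETECT_MS + ACTUATE_MS
    assert latency < HUMAN_PERCEPTION_MS  # 21 ms < 40 ms: the cheat is invisible
    return COUNTER[human_shape]

print(robot_reply("rock"))  # paper
```

The timing budget, not the move table, is the hard engineering problem: perception and actuation together must beat human vision by a comfortable margin.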
