Amazing Science
Software Uses Cyborg Swarm of Cockroaches To Map Unknown Environments


Researchers from North Carolina State University have developed software that allows them to map unknown environments – such as collapsed buildings – based on the movement of a swarm of insect cyborgs, or “biobots.”

 

“We focused on how to map areas where you have little or no precise information on where each biobot is, such as a collapsed building where you can’t use GPS technology,” says Dr. Edgar Lobaton, an assistant professor of electrical and computer engineering at NC State and senior author of a paper on the research.

 

“One characteristic of biobots is that their movement can be somewhat random,” Lobaton says. “We’re exploiting that random movement to work in our favor.”

 

Here’s how the process would work in the field. A swarm of biobots, such as remotely controlled cockroaches, would be equipped with electronic sensors and released into a collapsed building or other hard-to-reach area. The biobots would initially be allowed to move about randomly. Because the biobots couldn’t be tracked by GPS, their precise locations would be unknown. However, the sensors would signal researchers via radio waves whenever biobots got close to each other.

 

Once the swarm has had a chance to spread out, the researchers would send a signal commanding the biobots to keep moving until they find a wall or other unbroken surface – and then continue moving along the wall. This is called “wall following.”

 

The researchers repeat this cycle of random movement and “wall following” several times, continually collecting data from the sensors whenever the biobots are near each other. The new software then uses an algorithm to translate the biobot sensor data into a rough map of the unknown environment.
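
A toy Python simulation of the encounter-logging step is sketched below. All parameters are made up and only the random-movement phase is mimicked; the paper's actual topological-mapping algorithm is considerably more sophisticated.

```python
import itertools
import random

# Toy sketch of encounter logging for an unlocalized swarm (illustrative
# parameters only; the NC State software reconstructs the topology of the
# environment, a harder problem than this simulation addresses).
NUM_BOTS, STEPS, ENCOUNTER_RADIUS, ARENA = 20, 500, 1.0, 20.0

random.seed(42)
positions = [[random.uniform(0, ARENA), random.uniform(0, ARENA)]
             for _ in range(NUM_BOTS)]
encounters = []  # (time_step, bot_i, bot_j): the only data radioed out

for t in range(STEPS):
    # Random-movement phase; a real deployment alternates this with
    # the "wall following" phase described above.
    for p in positions:
        p[0] = min(max(p[0] + random.gauss(0, 0.3), 0.0), ARENA)
        p[1] = min(max(p[1] + random.gauss(0, 0.3), 0.0), ARENA)
    # Proximity sensing: log whenever two biobots come near each other.
    for i, j in itertools.combinations(range(NUM_BOTS), 2):
        dx = positions[i][0] - positions[j][0]
        dy = positions[i][1] - positions[j][1]
        if dx * dx + dy * dy < ENCOUNTER_RADIUS ** 2:
            encounters.append((t, i, j))

# A mapping algorithm would now infer a rough layout purely from the
# timing and pattern of these encounters, with no GPS coordinates at all.
print(f"logged {len(encounters)} pairwise encounters")
```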

 

“This would give first responders a good idea of the layout in a previously unmapped area,” Lobaton says. The software would also allow public safety officials to determine the location of radioactive or chemical threats, if the biobots have been equipped with the relevant sensors.


The researchers have tested the software using computer simulations and are currently testing the program with robots. They plan to work with fellow NC State researcher Dr. Alper Bozkurt to test the program with biobots.

 

The paper, “Topological Mapping of Unknown Environments using an Unlocalized Robotic Swarm,” will be presented at the International Conference on Intelligent Robots and Systems being held Nov. 3-8 in Tokyo, Japan.


'Terminator arm' churned out of 3D printer


Transparent plastic arm shows how 3D printers can create strong structure, mobile joints and delicate sensors in one process.

 

It may look like a sci-fi movie prop, but it could be a glimpse at the future of prosthetics. 3D printing can render everyday artefacts in clear plastic, so we can see in unprecedented detail how they work – and this exquisite model of a prosthetic arm is a brilliant example. It is one of the highlights at the London Science Museum's 3D printing exhibition, which features more than 600 printed objects.


Designed by Richard Hague, director of the Additive Manufacturing and 3D Printing Research Group at the University of Nottingham, UK, and his students, the arm shows how the printers can create strong structure, mobile joints and delicate sensors – like spiral-shaped metal touch-detectors – all in one process.

 

"It's a mock-up but it shows circuits that sense temperature, feel objects and control the arm's movement," says Hague. "3D printing gives us the freedom to make complex, optimised shapes, and our research aim is focused on printing-in electrical, optical or even biological functions."

 

Such techniques are also bringing prosthetics to people who previously could not afford them. For instance, the open-source "robohand" project, pioneered by South African carpenter Richard Van As, aims to print cheap, plastic customised prostheses for people who have lost fingers, or who were born with some digits missing or malformed. Some of his work – with the designs available online – is also on show at the Science Museum.


Boston Dynamics Unleashes "Cheetah" - World's Fastest Running Robot

Boston Dynamics is the leading provider of human simulation software, tools, and solutions. Organizations worldwide use its products and services for simulation-based training, mission-planning, analysis, and virtual prototyping applications.

 

The Cheetah robot is the fastest legged robot in the world, surpassing 29 mph, a new land speed record for legged robots. The previous record was 13.1 mph, set in 1989 at MIT. The Cheetah robot has an articulated back that flexes back and forth on each step, increasing its stride and running speed, much like the animal does. The current version of the Cheetah robot runs on a high-speed treadmill in the laboratory, where it is powered by an off-board hydraulic pump and uses a boom-like device to keep it running in the center of the treadmill. The next generation Cheetah robot, WildCat, is designed to operate untethered. WildCat recently entered initial testing and is scheduled for outdoor field testing later in 2013.


Cheetah robot development is funded by DARPA's Maximum Mobility and Manipulation program.

 

For a robot, control is good, freedom is better and a chaotic system will stabilize itself

Chaos, as found in robot control systems, can be stabilised more quickly and with less control.

 

When chaos threatens, speed is essential; for example, when a pacemaker needs to stabilise an irregular heartbeat or a robot has to react to the information received from its environment. Both cases require imposing a stable, organised state on a chaotic system. Scientists from the Max Planck Institute for Dynamics and Self-Organization in Göttingen, the Bernstein Center for Computational Neuroscience Göttingen and the University of Göttingen have developed a method for accelerating control. The key to success: A less invasive approach that cleverly exploits the natural behaviour of the system.


When the ground beneath Amos starts to rise, the insectoid robot can skilfully adapt to the changing conditions. After only a moment’s hesitation, he autonomously switches gait and selects a different movement pattern for his six legs, suitable for climbing the slope. To do this, Amos’ “brain”, a comparatively tiny network with few circuits, has to work at full tilt. Can this “thought process” be accelerated? Scientists in Göttingen think so. Their calculations show how Amos’ reaction times can be significantly reduced.

 

The autonomous six-legged robot was developed three years ago and subsequently optimised by a team led by theoretical physicist Marc Timme, who, together with his Research Group, works at the Max Planck Institute for Dynamics and Self-Organization and headed the new study along with robotics expert Poramate Manoonpong from the University of Göttingen. However, the new method is not just suitable for robots such as Amos; basically, it can be applied to any chaotic system where a certain degree of control is required. “Every chaotic system is very susceptible to interference”, Marc Timme explains. Even the smallest external change may trigger a completely different behaviour. In Amos’ case, chaos means that his “brain” would produce a chaotic activity pattern with signals flying in all directions.

 

In order to organise this chaotic pattern, the system requires help. Scientists speak of “chaos control”. The most common methods used begin by trying to calculate the behaviour of the system in the near future. The second step is to transform this information into a control signal which is used to correct the development of the system – a gentle nudge to bring it back on track.

 

However, the Göttingen-based research team has demonstrated that less intervention can be more effective. “The trick is to limit the number of times we push the system towards the required stable state”, says Max Planck researcher Christian Bick. “By giving the system the freedom to develop on its own from time to time, we achieve the desired result faster.”  Physicists call this a self-organised process.

 

“At first glance, this method may seem roundabout”, Bick admits. However, the self-stabilisation of the system is actually very efficient and fast. Only occasional external interventions are required to make sure that the path chosen by the system does not deviate from the right track. Depending on the system, the new method may easily be 100 or 1000 times faster, and requires significantly fewer interventions. “What’s more, theoretically this would permit stabilisation of very complex movement patterns for Amos”, Timme adds.
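
The flavour of the approach can be demonstrated on a textbook chaotic system. The sketch below is not the Göttingen group's method, just a classic illustration of the same principle: let the chaotic logistic map wander freely, and intervene only once its own dynamics carry it near the unstable target state.

```python
# Toy illustration of the principle above (not the Göttingen method):
# stabilize the unstable fixed point of the chaotic logistic map by
# letting the free dynamics do most of the work.
r = 3.9                      # parameter in the chaotic regime
x_star = 1 - 1 / r           # unstable fixed point of x -> r*x*(1-x)
eps = 0.02                   # capture neighbourhood around the target

x, t = 0.37, 0
while abs(x - x_star) > eps:     # no control at all: free chaotic motion
    x = r * x * (1 - x)
    t += 1
print(f"orbit wandered to within {eps} of the target on its own after {t} steps")

max_push = 0.0
for _ in range(200):             # now hold it there with small corrections
    x = r * x * (1 - x)
    push = x_star - x            # deviation stays tiny, so the control
    max_push = max(max_push, abs(push))  # signal stays tiny as well
    x += push
print(f"stabilized at x = {x:.6f}; largest correction used = {max_push:.4f}")
```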

 



NASA launches robotic explorer to moon, orbiting craft will study lunar atmosphere and dust


NASA aiming for the moon again, this time from Virginia. NASA's newest robotic explorer rocketed into space late Friday in an unprecedented moonshot from Virginia that dazzled sky watchers along the East Coast of the U.S.

 

But the LADEE spacecraft quickly ran into equipment trouble, and while NASA assured everyone early Saturday that the lunar probe was safe and on a perfect track for the moon, officials acknowledged the problem needs to be resolved in the next two to three weeks.

 

It was a change of venue for NASA, which normally launches moon missions from Cape Canaveral, Florida. But it provided a rare light show along the East Coast for those blessed with clear skies.

 

NASA urged sky watchers to share their launch pictures through the website Flickr, and the photos and sighting reports quickly poured in from New York City, Boston, Washington, D.C., Baltimore, New Jersey, Rhode Island, eastern Pennsylvania and Virginia, among other places.

 

The Lunar Atmosphere and Dust Environment Explorer or LADEE, pronounced "LA'-dee," is taking a roundabout path to the moon, making three huge laps around Earth before getting close enough to pop into lunar orbit.

 

Unlike the quick three-day Apollo flights to the moon, LADEE will need a full month to reach Earth's closest neighbor. An Air Force Minotaur V rocket, built by Orbital Sciences Corp., provided the ride from NASA's Wallops Flight Facility.

 

LADEE, which is the size of a small car, is expected to reach the moon on Oct. 6. Scientists want to learn the composition of the moon's ever-so-delicate atmosphere and how it might change over time. Another puzzle, dating back decades, is whether dust actually levitates from the lunar surface.

 

The $280 million moon-orbiting mission will last six months and end with a suicide plunge into the moon for LADEE.

 

The 844-pound (380-kilogram) spacecraft has three science instruments as well as laser communication test equipment that could revolutionize data relay. NASA hopes to eventually replace its traditional radio systems with laser communications, which would mean faster bandwidth using significantly less power and smaller devices.

 

"There's no question that as we send humans farther out into the solar system, certainly to Mars," that laser communications will be needed to send high-definition and 3-D video, said NASA's science mission chief, John Grunsfeld, a former astronaut who worked on the Hubble Space Telescope.


Robotic Insect Eyes Destined for Next-Generation Micro Drone Development


While consumer cameras are inspired by the single-lens mammalian eye, most animal species use compound eyes, which consist of a dense mosaic of tiny eyes. Compared to single-lens eyes, compound eyes offer lower resolution but significantly larger fields of view, a thin package, and negligible distortion, all features that are very useful for motion detection in tasks such as collision avoidance, distance estimation, take-off and landing. Attempts have recently been made to develop artificial compound eyes, but none of the solutions proposed so far included fast motion detection across the very large range of illuminations that insects can handle.

The novel curved artificial compound eye (CurvACE) features a panoramic, hemispherical field of view of 180°x60°, with a resolution identical to that of the fruitfly, in a package less than 1 mm thick. Additionally, it can extract images at 1,500 frames per second with a 300 Hz signal bandwidth, which is 3 times faster than the fruitfly, and includes neuromorphic photoreceptors that allow motion perception in a wide range of environments, from a sunny day to moonlight (~1 lux).

Furthermore, the artificial compound eye possesses embedded and programmable vision processing, which allows customizable integration in a broad range of applications where motion detection is important, such as mobile (micro)robots/micro air vehicles (MAVs), home automation, surveillance, medical instruments, prosthetic devices, and smart clothing.


Via The Robot Launch Pad, Kalani Kirk Hausman

Artificial robot muscles can lift loads 80 times their weight


National University of Singapore’s (NUS) engineers have created efficient artificial muscles that could one day carry 80 times their own weight and extend to five times their original length when carrying the load.

 

The team’s invention could lead to life-like robots with superhuman strength and the ability to convert and store energy, which could help the robots charge themselves quickly.

 

“Our materials mimic those of the human muscle, responding quickly to electrical impulses, instead of using mechanisms driven by hydraulics,” which create the slow, jerky movements of robots, said Dr Adrian Koh from NUS’ Engineering Science Program and Department of Civil and Environmental Engineering, Faculty of Engineering.

 

“Now, imagine artificial muscles that are pliable, extendable and react in a fraction of a second, like those of a human. Robots equipped with such muscles would be able to function in a more human-like manner — and outperform humans in strength.”

 

The researchers plan to create robots and robotic limbs that are more human-like in both functions and appearance — and more powerful. In less than five years, they expect to develop a robotic arm about half the size and weight of a human arm that can out-wrestle a person.

 

The secret: polymers that can stretch more than 10 times their original length (a strain displacement of 1,000 per cent) while lifting a load of up to 500 times their own weight. Also, as the muscles contract and expand, they are capable of converting mechanical energy into electrical energy. "A 10kg electrical generator built from these soft materials would be capable of producing the same amount of energy as a one-ton electrical turbine," Koh said.

 

This means that the energy generated may lead to a robot being self-powered after less than a minute of charging, he said. “Think of how efficient cranes could get when armed with such muscles,” he added.


Crab-Like Robot Walks Along the Ocean Floor to Investigate Shipwrecks


As six-legged robots go, other than its nifty red and yellow paint job, the Crabster robot has a pretty standard look. It isn’t the biggest hexapod, like the impressive two-ton Mantis, or a tiny hexapod with a weird gait, like Boston Dynamics’ RHex. What makes Crabster special isn’t so much what it is but where it will walk—the robot was designed to navigate the seafloor.

 

Ocean researchers already use both autonomous and remote-control undersea vehicles, but propulsion systems tend to kick up sediment, adversely affecting visibility, and lack the power to deal with strong currents.

Crabster’s creators designed the robot to solve these problems. Developed by the Korea Institute of Ocean Science and Technology (KIOST), the robot can withstand heavy currents by changing its posture (roll, pitch, and yaw), and the robot’s measured gait won’t significantly disturb sediment.

 

Crabster is lowered to the seafloor by crane and remains attached to an umbilical for power, limiting where it can go but allowing for continuous operation. Four operators remotely drive the robot from the surface—directing and monitoring its movement, manipulators, cameras, lights, and sonar.

On the seafloor, the half-ton robot illuminates murky water with a spotlight, records what it sees with ten onboard cameras, and uses its two front legs to pick up and manipulate objects. Researchers hope to send Crabster to explore shipwrecks where they can return small treasures in the robot’s retractable tray. They’ll haul larger objects by attaching a tow cable connected to the vessel above.

 

Crabster recently took its first dip in the ocean and will soon head out to sea to begin work 200 meters below the surface. Eventually Crabster’s engineers hope to give it an onboard power source, and we imagine future iterations might combine the best of both worlds—a Crabster that folds its legs to go swimming and, when a stroll better suits its purposes, deploys its legs for a landing on the sea-floor.

Ron Peters's curator insight, October 17, 2013 10:08 AM

Interesting ROV/AUV twist...


Visual Odometry for GPS-denied Flight and Environment Mapping using a Kinect mounted to a Quadrotor


A robotics group at MIT has developed a real-time visual odometry system that can use a Kinect to provide fast and accurate estimates of a vehicle's 3D trajectory. This system is based on recent advances in visual odometry research, and combines a number of ideas from the state-of-the-art algorithms. It aligns successive camera frames by matching features across images, and uses the Kinect-derived depth estimates to determine the camera's motion.

 

The group has integrated the visual odometry into its quadrotor system, which was previously developed for controlling the vehicle with laser scan-matching. The visual odometry runs in real-time, onboard the vehicle, and its estimates have low enough delay that the team is able to control the quadrotor using only the Kinect and onboard IMU, enabling fully autonomous 3D flight in unknown GPS-denied environments. Notably, it does not require a motion capture system or other external sensors -- all sensing and computation required for local position control is done onboard the vehicle.
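
The geometric core of such a system, recovering the camera motion from matched 3D points in two frames, is compact enough to sketch. The minimal Kabsch-style alignment below uses synthetic data; the MIT pipeline adds feature detection, outlier rejection and much more.

```python
import numpy as np

# Core step of RGB-D visual odometry: given the 3D positions of matched
# features in frame A and frame B (depths from the Kinect), recover the
# camera's rotation R and translation t (Kabsch / least-squares fit).
def rigid_transform(points_a, points_b):
    """Find R, t minimizing ||points_b - (R @ points_a + t)||."""
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_a - ca).T @ (points_b - cb)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

# Synthetic check: rotate and translate a point cloud, then recover the motion.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))
angle = np.deg2rad(5)                                # small yaw between frames
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.10, 0.02, -0.05])
R_est, t_est = rigid_transform(pts, pts @ R_true.T + t_true)
print("rotation recovered:", np.allclose(R_est, R_true))
print("translation recovered:", np.round(t_est, 3))
```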


Drones Close In On Farms, The Next Step In Precision Agriculture


Drones continue their steady approach into the different aspects of our lives. But while controversy rages over drone devastation over foreign soil and prying surveillance over US soil, experts are beginning to point our attention to the real future of unmanned aerial vehicles: farming.

 

Drones are expected to benefit farms both big and small – small farms can save money and resources through greater precision, while big farms can more easily map and characterize crop health and yield across large areas. Such land monitoring was once performed on foot, with farmers seeing for themselves which areas needed more water or fertilizer. With the advent of precision agriculture, remote sensing has already become vital to many large farm operations. Satellites and aircraft take pictures in infrared to determine water distribution and movement, as well as weed coverage.

 

Thermal infrared sensors that measure heat can determine crop health from afar. Tractor booms are also being fitted with multi-spectral cameras so that they can take measurements while doing their jobs. But now drones can offer on-demand images much more inexpensively.
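
The article does not name a specific index, but the standard computation behind such infrared crop-health maps is the Normalized Difference Vegetation Index (NDVI). A minimal sketch with synthetic reflectance values:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red): high for healthy, water-rich canopy.
# Synthetic reflectance bands stand in for a drone's multi-spectral image.
rng = np.random.default_rng(1)
red = rng.uniform(0.05, 0.30, size=(4, 4))   # red-band reflectance
nir = rng.uniform(0.30, 0.80, size=(4, 4))   # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-9)      # epsilon avoids divide-by-zero
stressed = ndvi < 0.4                        # illustrative "needs water" cutoff
print(ndvi.round(2))
print("flagged cells:", int(stressed.sum()))
```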

 

High performance GPS allows these aerial farmers to be controlled with precision and remain stable. The CropCam is an RC glider plane equipped with a Pentax digital camera. It’s operated manually or preprogrammed on the ground to collect aerial photos to provide imagery for agriculture, forestry, environmental and other uses. Another is Airrobot’s ARB100-B used in France for agricultural surveying. And far ahead of their American counterparts, over 2,400 Yamaha RMAX Unmanned Helicopters are already tending to farmland across Japan, South Korea and Australia (these top-of-the-line machines cost $125,000 each).

 

And the RMAX can do more than just monitor, it can actually help with the farming. It’s equipped with a sprayer that can disperse granules, coated grains and fertilizers. The drone is smart enough to tell operators when airspeed is too high for optimal spread of material. And offering the farmer a rare opportunity to scale, up to six RMAXs can be operated simultaneously – tractors will need another gear just to keep up.

 

 

As more farmers choose drones over tractors, aerial farming is expected to make a big impact on the unmanned aerial vehicle (UAV) market. A report, published earlier this month by the Association for Unmanned Vehicle Systems International (AUVSI), estimates that 90 percent of potential markets for UAVs will be accounted for by public safety and precision agriculture. The FAA recently released a list of towns that have applied for UAV approval through October 2012 – there are 81 in all. The report predicts that widespread adoption of UAVs will inject $82 billion in economic activity and generate up to 100,000 new jobs between 2015 and 2025.


World Science Festival: Self-Aware Robots and Living among Thinking Machines

In recent years, machines have grown increasingly capable of listening, communicating, and learning—transforming the way they collaborate with us, and significantly impacting our economy, health, and daily routines. Who, or what, are these thinking machines? As we teach them to become more sophisticated, how will they complement our lives? What will separate their ways of thinking from ours? And what happens when these machines understand data, concepts, and behaviors too big or impenetrable for humans to grasp? We were joined by IBM's WATSON, the computer Jeopardy! champion, along with leading roboticists and computer scientists, to explore the thinking machines of today and the possibilities to come in the not-too-distant future.


Creating a sense of touch in a prosthetic hand


Scientists have made tremendous advances toward building lifelike prosthetic limbs that move and function like the real thing. But what’s missing is a sense of touch, so a patient knows how hard he or she is actually squeezing something, or exactly where the object is positioned relative to his or her hand.


“If you lose your somatosensory [body senses] system, it almost looks like your motor system is impaired,” said University of Chicago neuroscientist Sliman Bensmaia. “If you really want to create an arm that can actually be used dexterously without the enormous amount of concentration it takes without sensory feedback, you need to restore the somatosensory feedback.”


This is related to a similar problem in robotics (see “Related” below), where researchers have built better sensors into robots’ limbs and hands, along with better processing and control systems.

 

So a team of University of Chicago neurobiologists, headed by Sliman Bensmaia, assistant professor of organismal biology and anatomy, came up with an idea: why not try the same thing, starting with a monkey?

To restore the somatosensory feedback, they equipped a robotic hand with pressure sensors.

 

These send electrical signals for processing, and from there to electrodes implanted in the brain to recreate the same response to touch as a real hand. The researchers used rhesus macaques that were trained to respond to stimulation of the hand. Their hands were hidden so they wouldn’t see that they weren’t actually being touched, and were given electrical pulses to simulate the sensation of touch.


The animals had electrodes implanted into the area of the brain that responds to touch, allowing the researchers to check their responses to each type of stimulus. By combining the poking and brain-response data, the researchers were able to create a mathematical function describing the level of electrical pulses in the brain that corresponds to different levels of physical pokes of the hand.
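
In its simplest form, such a mapping could be a regression from poke intensity to stimulation level. The numbers and the linear model below are purely hypothetical, serving only to illustrate the calibration idea; the study's actual function may look quite different.

```python
import numpy as np

# Hypothetical calibration data: poke force on a real hand vs. the
# stimulation level that evoked a matching neural response.
force = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])   # newtons
pulse = np.array([12., 22., 41., 58., 79., 101.])  # microamps (made up)

slope, intercept = np.polyfit(force, pulse, 1)     # simple least-squares fit

def force_to_stimulation(f_newtons):
    """Turn a prosthetic pressure-sensor reading into a stimulation level."""
    return slope * f_newtons + intercept

print(f"pulse ~ {slope:.1f} * force + {intercept:.1f}")
print(f"a 0.5 N poke maps to about {force_to_stimulation(0.5):.0f} microamps")
```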


Then the researchers switched to a prosthetic hand that was wired to the brain implants. They touched the prosthetic hand with the physical probe, which in turn sent similar electrical signals to the brain. Bensmaia said that the animals performed identically whether poked on their own hand or on the prosthetic one.


“This is the first time as far as I know where an animal or organism actually perceives a tactile stimulus through an artificial transducer,” Bensmaia said.

“It’s an engineering milestone. But from a neuroengineering standpoint, this validates this function. You can use this function to have an animal perform this very precise task, precisely identically.”



Activision Shows Animated Human That Looks So Real, It's Uncanny


Activision showed off the state of the art of real-time graphics on Wednesday, releasing this mind-boggling character demo. The character's skin, facial expressions and eyes look so real, it's uncanny.

 

When you watch this video, see if you think this character has reached the other side of what's commonly called the "uncanny valley," a term coined by robotics pioneer Masahiro Mori in 1970. It describes the range of sophistication of animated graphics, from one side of the valley where human figures simply look unrealistic, to the middle of the valley — where they look just realistic enough to be creepy — to our side of the valley, where animation is indistinguishable from reality.

 

Whenever the uncanny valley is mentioned, the animation techniques from the November 2004 movie The Polar Express come to mind. Most viewers noticed the characters weren't quite photorealistic enough to keep them out of the creepy zone. But that was nearly 8 years ago, and graphics technology has made spectacular progress since then.


Via Marco Bertolini
Marco Bertolini's curator insight, March 28, 2013 1:59 PM

A stunning video: this gentleman is actually an animation created by the company Activision. An incredible recreation of facial expressions, skin texture, and more.


Theo Jansen, artist, inventor of robots powered by the wind only

Theo Jansen is the Dutch creator of what he calls "Kinetic Sculptures," where nature and technology meet. Essentially these sculptures are robots powered by the wind only.


Google Lunar X Prize: The $30 million competition for the first privately funded robot to the moon [VIDEOS]


25 teams from around the world are currently building robots, rockets, and lunar landers to win the $30 million Google Lunar X PRIZE. Every year, we collect hardware video clips from the teams and showcase their progress. 

This year shows some impressive advancements in the rover designs, propulsion and avionics technology. Teams are stepping it up as the competition intensifies, and with all the recent headlining developments, the Moon does not seem so far away.


MIT: Surprisingly simple scheme for self-assembling robots

Small cubes with no exterior moving parts can propel themselves forward, jump on top of each other, and snap together to form arbitrary shapes.

 

Known as M-Blocks, the robots are cubes with no external moving parts. Nonetheless, they're able to climb over and around one another, leap through the air, roll across the ground, and even move while suspended upside down from metallic surfaces.

Inside each M-Block is a flywheel that can reach speeds of 20,000 revolutions per minute; when the flywheel is braked, it imparts its angular momentum to the cube. On each edge of an M-Block, and on every face, are cleverly arranged permanent magnets that allow any two cubes to attach to each other.
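
A back-of-the-envelope calculation suggests why a small flywheel at 20,000 RPM is enough to flip a cube over an edge. All masses and dimensions below are assumed for illustration; the actual M-Block specifications are not given here.

```python
import math

# Assumed values, for illustration only.
m_cube, edge = 0.15, 0.05            # cube mass (kg) and edge length (m)
m_fly, r_fly = 0.03, 0.02            # flywheel mass (kg) and radius (m)

omega = 20000 * 2 * math.pi / 60     # 20,000 RPM in rad/s
I_fly = 0.5 * m_fly * r_fly**2       # solid-disc flywheel inertia
L = I_fly * omega                    # angular momentum dumped by the brake

# Energy needed to pivot over an edge: raise the centre of mass from
# edge/2 to (edge/2)*sqrt(2).
g = 9.81
dE = m_cube * g * (edge / 2) * (math.sqrt(2) - 1)

# A solid cube's moment of inertia about one edge is (2/3)*m*a^2.
I_cube = (2 / 3) * m_cube * edge**2
E_kin = L**2 / (2 * I_cube)          # if all momentum goes into the pivot

print(f"angular momentum from the brake: {L:.4f} kg*m^2/s")
print(f"energy needed to tip: {dE*1e3:.1f} mJ, available: {E_kin*1e3:.0f} mJ")
```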


“It’s one of these things that the [modular-robotics] community has been trying to do for a long time,” says Daniela Rus, a professor of electrical engineering and computer science and director of CSAIL. “We just needed a creative insight and somebody who was passionate enough to keep coming at it — despite being discouraged.”


To compensate for its static instability, the researchers’ robot relies on some ingenious engineering. On each edge of a cube are two cylindrical magnets, mounted like rolling pins. When two cubes approach each other, the magnets naturally rotate, so that north poles align with south, and vice versa. Any face of any cube can thus attach to any face of any other.

The cubes’ edges are also beveled, so when two cubes are face to face, there’s a slight gap between their magnets. When one cube begins to flip on top of another, the bevels, and thus the magnets, touch. The connection between the cubes becomes much stronger, anchoring the pivot. On each face of a cube are four more pairs of smaller magnets, arranged symmetrically, which help snap a moving cube into place when it lands on top of another.

As with any modular-robot system, the hope is that the modules can be miniaturized: the ultimate aim of most such research is hordes of swarming microbots that can self-assemble, like the “liquid steel” androids in the movie “Terminator II.” And the simplicity of the cubes’ design makes miniaturization promising.

But the researchers believe that a more refined version of their system could prove useful even at something like its current scale. Armies of mobile cubes could temporarily repair bridges or buildings during emergencies, or raise and reconfigure scaffolding for building projects. They could assemble into different types of furniture or heavy equipment as needed. And they could swarm into environments hostile or inaccessible to humans, diagnose problems, and reorganize themselves to provide solutions.


Human robot getting closer: iCub robot must learn from its experiences

A robot that feels, sees and, in particular, thinks and learns like us. It still seems like science fiction, but new research hints that it could happen. Scientists are working to implement the cognitive process of the human brain in robots.

 

The research should lead to the arrival of the latest version of the iCub robot in Twente. This human robot (humanoid) blurs the boundaries between robot and human.


Decades of scientific research into cognitive psychology and the brain have given us knowledge about language, memory, motor skills and perception. We can now use that knowledge in robots, but Frank van der Velde's research goes even further. "The application of cognition in technical systems should also mean that the robot learns from its experiences and the actions it performs. A simple example: a robot that spills too much when pouring a cup of coffee can then learn how it should be done."


The arrival of the iCub robot at the University of Twente should signify the next step in this research. Van der Velde submitted an application together with fellow UT researchers Stefano Stramigioli, Vanessa Evers, Dirk Heylen and Richard van Wezel, all active in robotics and cognition research. At the moment, twenty European laboratories have an iCub, which was developed in Italy (thanks to a European FP7 grant for the IIT); the Netherlands is still missing from the list. Moreover, a newer version is currently being developed, with, for example, haptic sensors. In February it will be announced whether the robotics group will actually bring the latest iCub to the UT. The robot costs a quarter of a million euros, and NWO (Netherlands Organisation for Scientific Research) will reimburse 75% of the costs. TNO (Netherlands Organisation for Applied Scientific Research) and the universities of Groningen, Nijmegen, Delft and Eindhoven would then also be able to make use of it. Within the UT, the iCub can be deployed in different laboratories thanks to a special transport system.


The possibilities are endless, according to Van der Velde. "The new iCub has a skin and fingers that have a much better sense of touch and can feel strength. That makes interaction with humans much more natural. We want to ensure that this robot continues to learn and understands how people function. This research ensures, for example, that robots actually gather knowledge by focusing on certain objects or persons. In areas of application like healthcare and nursing, such robots can play an important role. A good example would be that in ten years' time you see a blind person walking with a robot guide dog."


Video: http://www.youtube.com/watch?v=ZcTwO2dpX8A


Festo BionicOpter: Robot Flies Like A Real Dragonfly


With the BionicOpter, Festo has technically mastered the highly complex flight characteristics of the dragonfly. Just like its model in nature, this ultralight flying object can fly in all directions, hover in mid-air and glide without beating its wings.

 

Thirteen degrees of freedom for unique flight manoeuvres: In addition to control of the shared flapping frequency and twisting of the individual wings, each of the four wings also features an amplitude controller. The tilt of the wings determines the direction of thrust. Amplitude control allows the intensity of the thrust to be regulated. When combined, the remote-controlled dragonfly can assume almost any position in space.
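
As a purely illustrative toy model of this scheme (emphatically not Festo's controller), each wing can be treated as contributing a thrust vector whose magnitude tracks its stroke amplitude and whose direction tracks its tilt:

```python
import math

# Toy thrust model for a four-winged flapper (purely illustrative; Festo's
# real controller and aerodynamics are far more sophisticated).
def wing_thrust(amplitude, tilt_rad, k=0.8):
    """Thrust vector (forward, up) for one wing; k is an assumed lumped gain."""
    magnitude = k * amplitude                 # amplitude sets thrust intensity
    return (magnitude * math.sin(tilt_rad),   # tilt sets thrust direction
            magnitude * math.cos(tilt_rad))

wings = [           # (stroke amplitude 0..1, tilt in radians) per wing
    (0.9, 0.2), (0.9, 0.2),   # front pair, tilted slightly forward
    (0.6, 0.0), (0.6, 0.0),   # rear pair, thrusting straight up
]
fx = sum(wing_thrust(a, t)[0] for a, t in wings)
fz = sum(wing_thrust(a, t)[1] for a, t in wings)
print(f"net thrust: forward = {fx:.2f}, up = {fz:.2f} (arbitrary units)")
# Hovering means trimming the tilts so the net forward component is zero;
# tilting both pairs produces gliding flight in any direction.
```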

 

This unique way of flying is made possible by the lightweight construction and the integration of functions: components such as sensors, actuators and mechanical components as well as open- and closed-loop control systems are installed in a very tight space and adapted to one another.

 

With the remote-controlled dragonfly, Festo demonstrates wireless real-time communication, a continuous exchange of information, as well as the ability to combine different sensor evaluations and identify complex events and critical states.

 

Karlos Svoboda's comment, September 6, 2013 2:04 AM
The real dragonfly drives me crazy

Miniature Quadruped Robot Is Blazingly Fast And Travels At Over 30 Body Lengths Per Second


This robot, which still doesn't have a name, is very compact (it measures just 6.5 x 5.5 x 1 centimeters), and according to its creators it is quite possibly "the fastest legged robot of its size." Whether or not this really is a legged robot (or a quadruped) is perhaps debatable: these are wheel-legs, more commonly known as whegs. They're wheels in that there's rotary motion going on, but they're also legs in that there are discrete points of contact with the ground. To some extent, whegs offer the best of both worlds: they can be directly driven with conventional motors and allow for high speed and efficiency, while simultaneously providing traction over rough terrain and obstacles. Plus, you can easily swap them out, and by making them out of springy materials, you can give your robot some compliance.

 

What makes the robot wicked fast is the fact that it's got four independent drive motors, each one of which has a power to weight ratio that's absolutely bananas. Only 6 millimeters in size each, the motors output 1.5 watts of power at 40,000 RPM, driving the individual whegs through 16:1 planetary gearheads. They're not cheap (hundreds of dollars each), but they make for one crazy little robot. And of course, independently driven whegs make the robot smaller, lighter, simpler to steer, and generally more efficient overall.
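
These drivetrain figures can be sanity-checked with a few lines of arithmetic. The wheg length below is assumed, since the article only says the current whegs are about half the proposed 35 mm upgrade:

```python
import math

# Quoted figures: 40,000 RPM motors through 16:1 planetary gearheads.
motor_rpm, gear_ratio = 40000, 16
wheg_rps = motor_rpm / gear_ratio / 60        # wheg revolutions per second

wheg_length = 0.0175   # m, ASSUMED: about half the proposed 35 mm whegs
body_length = 0.065    # m, the robot is 6.5 cm long

# Treat one wheg revolution as roughly one circumference of travel.
speed = wheg_rps * 2 * math.pi * wheg_length
print(f"wheg speed: {wheg_rps:.0f} rev/s")
print(f"no-slip ceiling: {speed:.1f} m/s = {speed / body_length:.0f} body lengths/s")
# The article reports just over 30 body lengths per second in practice;
# as noted below, bouncing keeps the robot well under this ceiling.
```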

 

The current generation of this robot isn't capable of taking advantage of all of the power that the motors offer: even at top speed, it's only using about 0.60 watt, less than half of what the motors can output, since increasing wheel speed causes the robot to bounce along the ground, decreasing its actual speed. But, there's a lot of potential for swapping in some new whegs up to 35 mm in length (about twice as long as those currently on the robot), "which might produce even faster running speeds and the ability to navigate very large obstacles or challenging terrain, with a robot that still fits in your hand."


Amazing Science: Robotics Postings


Robotics is the branch of technology that deals with the design, construction, operation, and application of robots, as well as computer systems for their control, sensory feedback, and information processing. These technologies deal with automated machines that can take the place of humans in dangerous environments or manufacturing processes, or resemble humans in appearance, behavior, and/or cognition. Today, robotics is a rapidly growing field: as technological advances continue, researchers design and build new robots for an ever-wider range of practical purposes, and many of today's robots are inspired by nature.


A groundbreaking project called Aireal lets you actually feel virtual objects

A groundbreaking project called Aireal lets you feel virtual objects. Aireal is the result of research by University of Illinois PhD student Rajinder Sodhi and Disney Research’s Ivan Poupyrev. When set by your television or connected to an iPad, this diminutive machine will puff air rings that allow you to actually feel objects and textures in midair — no special controllers or gloves required.

 

The machine itself is essentially a set of five speakers in a box — subwoofers that track your body through IR, then fire low frequencies through a nozzle to form donut-like vortices.

 

In practice, Aireal can do anything from creating a button for you to touch in midair to crafting whole textures by pulsing its bubbles to mimic water, stone, and sand. … A single Aireal could conceivably support multiple people, and a grid of Aireals could create extremely immersive rooms, creating sensations like a flock of birds flying by.

Marie Rippen's curator insight, July 24, 2013 11:15 AM

Besides entertainment, this could have applications in physical therapy, education, advertising--anything you can think of where communicating the sensation of touch is important. Although, the first thing that popped into my head was Star Trek... holodeck anyone?

 

 

Operations in China will soon be performed by American doctors in Texas, via robots

A new partnership between two hospitals in China and the US will soon have Chinese patients on an operating table with a robot standing over them. At the controls will be a US doctor in Texas.

Microsoft's Robotic Touch-Back Screen Lets You Feel Structures


Touch screens are nothing new, but this prototype from Microsoft uses a robot-mounted display to do something surprising: touch back.

Early this year at TechFest, Microsoft Research showed off a number of cool new user interaction applications. One of them is a prototype of a haptic feedback touch screen called TouchMover. The company is preparing an announcement with more details about the technology, but here's a sneak peek.

 

The robotic system behind the curtain pushes back with a pressure that reflects the physical properties of virtual objects on the screen. A granite block in a 3D playground is harder to move than a wooden block, while plastic beach balls are light and, as you move your finger around it, you can feel its "roundness." I got a chance to try it, and it's a little freaky to use, but remarkable.
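
One simple way to render such an effect is to scale the arm's opposing force by a per-material resistance, as sketched below with made-up coefficients (the article does not describe Microsoft's actual force model):

```python
# Toy force-rendering loop for a touch-back screen (made-up coefficients;
# illustrative of the idea, not Microsoft's implementation).
MATERIALS = {        # assumed resistance, newtons per (m/s) of finger push
    "granite": 60.0,
    "wood": 25.0,
    "beach_ball": 4.0,
}

def pushback_force(material, push_velocity_m_s):
    """Opposing force the robot arm applies as the user pushes the screen."""
    return MATERIALS[material] * push_velocity_m_s

for material in MATERIALS:
    force = pushback_force(material, 0.05)   # finger pushing at 5 cm/s
    print(f"{material:>10}: {force:5.1f} N of resistance")
```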

 

Researchers uploaded a full set of MRI brain scans and demoed how doctors might scroll through them and annotate specific slides. And with some additional programming, the researchers could also make the TouchMover provide haptic feedback based on the material properties and texture of the skull bone and pulpy brain tissue, making the screen feel like palpating an actual brain.


Mind control successfully used to pilot helicopter. No invasive surgery required, only an EEG scanner cap


Researchers at the University of Minnesota have designed an interface that allows humans to control a robot using only their thoughts.

 

How close are we getting to actual brain control? It's starting to seem not far off at all. On the sillier end of the spectrum, we've seen robotic ears and tails that respond to brainwaves; more recently, we've also seen a Chilean company that has created a brain interface for designing printable objects, a mind-controlled exoskeleton for helping people walk, and even mind-to-mind communication.

 

A team of researchers at the University of Minnesota has just added another exciting new technology to the list: a quadcopter that can perform feats of aerial agility, controlled entirely by the pilot's thoughts.

Using electroencephalography (EEG), a non-invasive cap fitted with 64 electrodes reads the electrical impulses of the brain to control the copter. Thinking of making a fist with the left hand, for example, fires off certain neurons in the brain's motor cortex; the cap interprets this pattern and sends a command to the copter to turn left. Other commands include thinking of making a fist with the right hand to turn right, and making two fists to tell the copter to rise.
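
A heavily simplified sketch of that decoding idea follows, with illustrative thresholds and synthetic data; the Minnesota system uses 64 channels and a properly trained decoder.

```python
import numpy as np

# Toy motor-imagery decoder (illustrative thresholds and synthetic data).
# Imagining a left-hand fist suppresses the 8-12 Hz "mu" rhythm over the
# RIGHT motor cortex (electrode C4), and vice versa; comparing mu power
# at C3 and C4 therefore hints at which hand the pilot is imagining.
FS = 250  # Hz, assumed sampling rate

def mu_power(window):
    """Power in the 8-12 Hz band of a one-second EEG window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), 1 / FS)
    return spectrum[(freqs >= 8) & (freqs <= 12)].sum()

def decode(c3_window, c4_window, threshold=0.2):
    """Map lateralized mu suppression to a copter command (toy version)."""
    p3, p4 = mu_power(c3_window), mu_power(c4_window)
    ratio = (p3 - p4) / (p3 + p4)
    if ratio > threshold:
        return "turn_left"    # mu suppressed over C4: left-hand imagery
    if ratio < -threshold:
        return "turn_right"   # mu suppressed over C3: right-hand imagery
    return "hold"             # a real decoder also detects two-fist "rise"

# Synthetic check: strong mu at C3, suppressed mu at C4 -> turn left.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
c3 = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)
c4 = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)
print(decode(c3, c4))  # expected: turn_left
```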

 

In this way, five subjects — two male and three female — were able to successfully pilot the quadcopter quickly and accurately for a sustained period of time through an obstacle course in the university's gymnasium.

Professor Bin He, lead author of the study "Quadcopter control in three-dimensional space using a non-invasive motor imagery-based brain-computer interface", hopes that the research will be developed to create solutions for the disabled. "Our next goal is to control robotic arms using non-invasive brain wave signals, with the eventual goal of developing brain-computer interfaces that aid patients with disabilities or neurodegenerative disorders," he said.

 

This will not be the first mind-controlled robotic arm; however, the robotic arm announced in December last year requires a brain implant. Bin He's solution is much less invasive, requiring no surgery to implant the interface.


Robotic insects make first controlled flight


In culmination of a decade's work, RoboBees achieve vertical takeoff, hovering, and steering.

 

In the very early hours of the morning, in a Harvard robotics laboratory last summer, an insect took flight. Half the size of a paperclip, weighing less than a tenth of a gram, it leapt a few inches, hovered for a moment on fragile, flapping wings, and then sped along a preset route through the air.

Like a proud parent watching a child take its first steps, graduate student Pakpong Chirarattananon immediately captured a video of the fledgling and emailed it to his adviser and colleagues at 3 a.m.—subject line, "Flight of the RoboBee."

 

"I was so excited, I couldn't sleep," recalls Chirarattananon. The demonstration of the first controlled flight of an insect-sized robot is the culmination of more than a decade's work, led by researchers at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard.

 

"This is what I have been trying to do for literally the last 12 years," says Robert J. Wood, Charles River Professor of Engineering and Applied Sciences at SEAS, Wyss Core Faculty Member, and principal investigator of the National Science Foundation-supported RoboBee project. "It’s really only because of this lab’s recent breakthroughs in manufacturing, materials, and design that we have even been able to try this. And it just worked, spectacularly well."

 

Inspired by the biology of a fly, with submillimeter-scale anatomy and two wafer-thin wings that flap almost invisibly, 120 times per second, the tiny device not only represents the absolute cutting edge of micromanufacturing and control systems; it is an aspiration that has impelled innovation in these fields by dozens of researchers across Harvard for years.

 

"We had to develop solutions from scratch, for everything," explains Wood. "We would get one component working, but when we moved onto the next, five new problems would arise. It was a moving target."

Flight muscles, for instance, don't come prepackaged for robots the size of a fingertip.

 

"Large robots can run on electromagnetic motors, but at this small scale you have to come up with an alternative, and there wasn’t one," says co-lead author Kevin Y. Ma, a graduate student at SEAS.

 

The tiny robot flaps its wings with piezoelectric actuators—strips of ceramic that expand and contract when an electric field is applied. Thin hinges of plastic embedded within the carbon fiber body frame serve as joints, and a delicately balanced control system commands the rotational motions in the flapping-wing robot, with each wing controlled independently in real-time.

 

At tiny scales, small changes in airflow can have an outsized effect on flight dynamics, and the control system has to react that much faster to remain stable.
