Imagine if you could take living cells, load them into a printer, and squirt out a 3D tissue that could develop into a kidney or a heart. Scientists are one step closer to that reality, now that they have developed the first printer for embryonic human stem cells.
In a new study, researchers from the University of Edinburgh have created a cell printer that spits out living embryonic stem cells. The printer was capable of printing uniform-size droplets of cells gently enough to keep the cells alive and maintain their ability to develop into different cell types. The new printing method could be used to make 3D human tissues for testing new drugs, to grow organs, or ultimately to print cells directly inside the body.
We've been modifying our appearance ever since we first figured out how to pierce skin with wood and bone. Today, our tendency to twist, morph, and expand upon our naturally given forms is alive and well, expressed most radically by the body modification community. And now, with the advent of new technologies, this subculture is poised to take body modification to further extremes.
Neil Harbisson can only see shades of grey. So his prosthetic eyepiece, which he calls an “eyeborg”, interprets the colours for him and translates them into sound. Harbisson’s art sounds like a kind of inverse synaesthesia. But where synaesthetes experience numbers or letters as colours or even “taste” words, for example, Harbisson’s art is down to a precise transposition of colour into sound frequencies. As a result, he is able to create facial portraits purely out of sound, and he can tell you that the colour of Mozart’s music is mostly yellow. Liz Else caught up with him at the TEDGlobal conference.
When did you realise you were colour blind?
When I was a kid they noticed that I had a big problem with colour blindness. They thought it was the normal red-green type, but it wasn't. Eventually, when I was 11 years old, they diagnosed me with achromatopsia, which means I can only see shades of grey. About one in 33,000 people have this type of colour blindness.
What is the gadget you are wearing?
It's a sensor that lets me “see” colours.
How does it work?
Colour is basically hue, saturation, and light. Right now I can see light in shades of grey, but I can’t see its saturation or hue. This gadget detects the light’s hue and converts it into a sound frequency that I can hear as a note [frequency is inversely proportional to wavelength, so the device can easily convert the wavelength of the light into a sound frequency]. It also translates the saturation of the colour into volume. So if it’s a vivid red, I will hear it more loudly.
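The mapping Harbisson describes can be sketched in a few lines. This is a minimal illustration, not the eyeborg's actual algorithm: it assumes the light frequency is transposed down by whole octaves (repeated halving) until it lands in the audible band, and that saturation maps linearly to volume.

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second

def colour_to_tone(wavelength_nm, saturation):
    """Convert a light wavelength and saturation into an audible tone.

    The octave-transposition scheme here is an assumption for
    illustration: light frequency is halved repeatedly until it
    falls below the upper limit of human hearing (~20 kHz), and
    saturation (0.0-1.0) is mapped linearly to volume.
    """
    light_hz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)
    tone_hz = light_hz
    while tone_hz > 20_000:       # drop an octave at a time
        tone_hz /= 2
    volume = max(0.0, min(1.0, saturation))
    return tone_hz, volume

# A vivid red (~700 nm) yields a tone in the top audible octave,
# played loudly; a washed-out red yields the same pitch, but quieter.
tone, volume = colour_to_tone(700, 0.9)
```

Because every colour is folded into the same top octave, each hue keeps a distinct pitch while vividness controls only loudness, matching the description above.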
All the translation happens in a chip at the back of my neck - it's held against the bone by pressure. It stays on all the time, even when I go to bed. In September I'm having it osteointegrated - which means that part of the device will be implanted into my bone at a hospital in Barcelona, and then the sound will resonate much better. It took a year to convince them that it was ethical and that it was part of me.
Broadcast (2012) Adam Rutherford meets a new creature created by American scientists, the spider-goat. It is part goat, part spider, and its milk can be used to create artificial spider silk. It is part of a new field of research, synthetic biology, with a radical aim: to break down nature into spare parts so that we can rebuild it however we please. This technology is already being used to make biodiesel to power cars. Other researchers are looking at how we might, one day, control human emotions by sending 'biological machines' into our brains.
Cyborgs in Latin America explores the ways cultural expression in Latin America has grappled with the changing relationships between technology and human identity.
The book takes a literary and cultural studies approach in examining narrative, film and advertising campaigns from Argentina, Bolivia, Chile, Mexico and Uruguay by such artists as Ricardo Piglia, Edmundo Paz Soldán, Carmen Boullosa and Alberto Fuguet among others.
Using and criticizing theoretical models developed by Katherine Hayles, Donna Haraway, Gilles Deleuze and Michel Foucault, the book will appeal to specialists and students of Latin American Studies; Posthuman Theory; and Literature, Science and Technology Studies.
It's like something out of Kafka. Anti-science anarchists in Italy appear to be ramping up their violent and frankly surreal campaign. Having claimed responsibility for shooting the boss of a nuclear engineering company in Genoa, the group has vowed to target Finmeccanica, the Italian aerospace and defence giant.
In a diatribe sent to the Corriere della Sera newspaper on 11 May, the Olga Cell of the Informal Anarchist Federation International Revolutionary Front said it shot Roberto Adinolfi, head of Ansaldo Nucleare, in the leg four days earlier. "With this action of ours, we return to you a tiny part of the suffering that you, man of science, are pouring into this world," the statement said. It also pledged a "campaign of struggle against Finmeccanica, the murderous octopus". Ansaldo is one of Finmeccanica's many offshoots.
The cell has previously targeted nanotechnology researchers, and in 2010 it tried to bomb an IBM lab in Zurich, Switzerland. The attempt resulted in three conspirators being caught and jailed.
(Originally published at URBNFUTR, which graciously allowed me to republish here)
That we are intimate with the world is not news. That we have extended this intimacy to our tools is a reality; the idea that we are becoming cyborgs is already here.
There is nothing mysterious or futuristic about being a cyborg - part machine, part biological organism. Using our smartphones to remember our appointments, long lists of telephone numbers, addresses and shopping lists is a cyborg activity. The extension and externalization of our memories into electronic devices makes us de facto modern-day cyborgs.
Deploying devices to upgrade and extend our insufficient neuro-performance is only part of what we do as cyborgs. We use these extensions for training, for life tracking, for calorie counts, for sleep - you name it, there's an app for that. But lest we forget, we also have our loves, our cares and our motivations to deal with, our desires both of the flesh and of the imagination. These, too, are slowly coming into this symbiotic relationship.
The accelerated tooling times we live in allow us to gradually expand the notion of what it means to be human. For, make no mistake, we are far from being similar to our ancestors. A modern-day, urbanized, cybernetically hyperconnected human exists in a state of interdependence and sense extension the likes of which no one could have predicted.
LEFT your phone at home again? A solution is at hand: make sure it is with you at all times by having it implanted in your arm.
But given the opportunity, would you want your gadget to be a permanent part of you? The question may need answering sooner than you think.
Researchers at Autodesk, a software company in Toronto, Canada, checked to see whether the methods we currently use to interface with our gadgets work when the device is implanted in human tissue. The answer was a resounding "yes".
A button, an LED and a touch sensor all functioned appropriately when embedded under the skin of a cadaver's arm. The team was even able to communicate transcutaneously using a Bluetooth connection and charge the electronics wirelessly.
"That's the bottom line," says Christian Holz of the Autodesk team, who presented the work this week at the Conference on Human Factors in Computing Systems in Austin, Texas. "Traditional user interfaces work through the skin."
AR SPOT is an augmented-reality authoring environment for children. An extension of MIT’s Scratch project, this environment allows children to create experiences that mix real and virtual elements. Children can display virtual objects on a real-world scene observed through a video camera, and they can control the virtual world through interactions between physical objects.
This project aims to expand the range of creative experiences for young authors, by presenting AR technology in ways appropriate for this audience. In this process, we investigate how young children conceptualize augmented reality experiences, and shape the authoring environment according to this knowledge.
"A number of life-support machines are connected to each other, circulating liquids and air in attempt to mimic a biological structure...
(...) Cohen has long been investigating how machines, peripherals and even animals can work as extensions of the body or as substitutes for body parts. This time, however, the human body has been removed from the scene. Yet its presence and fragility can still be felt...
(...) Far from being just assemblages of tubes and circuits, the machines intersect with our culture, fears and beliefs..." - Regine
Entwined, Enmeshed, Entangled – Three modes of ‘being’ pertinent to our cyborgization process
By redesigning the conceptual landscape of our networked inter-relationality, we may finally disentangle ourselves from the all-pervading occlusion of the cyborgization process and allow a fresh recognition of the manifold human sensorium extended in hyperconnectivity. In re-conceptualizing our cyber existence we may need to relinquish a few cherished objects of identity, such as the man-machine interface, virtuality and man-machine co-existence, but more importantly the dualistic distinction between ‘real’ life and our virtual extensions as existence. All of these descriptive objects of identity, I suggest, should become ‘naturalized’ in a new cyber-existential language.
This is the first part of a three-pronged approach to what I believe is the foundation of a future philosophy of and for the hyperconnected individual. I will try to show that these three modes of beingness are the quintessential infrastructures necessary for a future technological civilization aiming for the firmament of freedom and equality, personal responsibility and open-access culture. A civilization whose roots we currently inhabit, but one that promises changes to the perception of ourselves, the understanding of the universe and the manner by which we may develop in tandem. The three lines of approach that will be used are: Entwinement, Enmeshment, and Entanglement. Each of these terms represents a similar but distinct manner of realizing the state of affairs of hyperconnectivity as the threshold infrastructure in the process of becoming a citizen of the future, a cyborg netizen and perhaps a posthuman.

Entwinement, Enmeshment and Entanglement each represent a different level of intimacy in the infocology (see lexical index) one exists in and partakes of. The three terms offered here are suggestions for an illustrative strategy that will allow a deeper and more accurate description of the state of affairs of our cyber existence. Each of these terms will be expanded upon later; for now, suffice it to say that they are distinguished primarily by the amount, depth and extensiveness of the connectivity between minds in the hyperconnected infosphere. Entwinement stands for the lowest level, Enmeshment for the medium level and Entanglement for the highest or deepest level.
What’s all the buzz about? Bee-inspired robots are coming, and they aren’t just for pollinating plants.
A number of researchers are investigating the development of artificial robotic bees, others are developing artificial bee brains and bee-based algorithms, and meanwhile a full connectome map of the honey bee brain has been developed. Artificial insect minds may be the first true Artificial General Intelligences (AGIs) available for commercial use.
h+ Magazine is a new publication that covers technological, scientific, and cultural trends that are changing human beings in fundamental ways.
Shawn Sarver took a deep breath and stared at the bottle of Listerine on the counter. “A minty fresh feeling for your mouth... cures bad breath,” he repeated to himself, as the scalpel sliced open his ring finger. His left arm was stretched out on the operating table, his sleeve rolled up past the elbow, revealing his first tattoo, the Air Force insignia he got at age 18, a few weeks after graduating from high school. Sarver was trying a technique he learned in the military to block out the pain, since it was illegal to administer anesthetic for his procedure.
“A minty fresh feeling... cures bad breath,” Sarver muttered through gritted teeth, his eyes staring off into a void.
Tim, the proprietor of Hot Rod Piercing in downtown Pittsburgh, put down the scalpel and picked up an instrument called an elevator, which he used to separate the flesh inside Sarver’s finger, creating a small empty pocket of space. Then, with practiced hands, he slid a tiny rare-earth magnet inside the open wound, the width of a pencil eraser and thinner than a dime. When he tried to remove his tool, however, the metal disc stuck to the tweezers. “Let’s try this again,” Tim said. “Almost done.”
The implant stayed put the second time. Tim quickly stitched the cut shut and cleaned off the blood. “Want to try it out?” he asked Sarver, who nodded with excitement. Tim dangled the needle from a string of suture next to Sarver’s finger, closer and closer, until suddenly it jumped through the air and stuck to his flesh, attracted by the magnetic pull of the metal implant.
“I’m a cyborg!” Sarver cried, getting up to join his friends in the waiting room outside. Tim started prepping a new tray of clean surgical tools. Now it was my turn.
Remember that scene in Minority Report when the spider robots stalk Tom Cruise to his apartment and scan his iris to identify him?
Things could have turned out so much better for Cruise had he been wearing a pair of contact lenses embossed with an image of someone else’s iris.
New research being released this week at the Black Hat security conference by academics in Spain and the U.S. may make that possible.
The academics have found a way to recreate iris images that match digital iris codes that are stored in databases and used by iris-recognition systems to identify people. The replica images, they say, can trick commercial iris-recognition systems into believing they’re real images and could help someone thwart identification at border crossings or gain entry to secure facilities protected by biometric systems.
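The attack works because iris-recognition systems do not compare images directly: they compare binary iris codes, typically by normalized Hamming distance, and accept any pair below a threshold. A minimal sketch of that matching step, with an illustrative code length and a commonly cited threshold of about 0.32 (the exact value varies by deployment):

```python
def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two equal-length iris codes."""
    if len(code_a) != len(code_b):
        raise ValueError("iris codes must be the same length")
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

def is_match(code_a, code_b, threshold=0.32):
    # Systems accept a pair whose normalized Hamming distance falls
    # below the threshold; 0.32 is an illustrative typical value.
    return hamming_distance(code_a, code_b) < threshold

# Toy 32-bit code standing in for a real stored template.
stored = [1, 0, 1, 1, 0, 0, 1, 0] * 4
probe = list(stored)
probe[0] ^= 1  # a near-identical code still matches
```

Any synthetic image whose extracted code lands inside that threshold is accepted as the enrolled person, which is exactly the property the researchers exploit: they reverse-engineer an image from the stored code rather than from the original eye.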
Ant Ballet is a six-year research project into control systems, paranoia and dancing insects. The project is separated into four phases.
Phase I (2010-2012) included thorough research into ants and control systems, synthesis of ant pheromones and testing of these systems with live ants in Barcelona. Through use of synthesised pheromones (Z9:16 Ald Hexadecenal), a robotic arm lays trails which cause ants to move in a different way to their natural foraging behaviour. This phase proves the viability of the research and technologies.
The first live performance of Ant Ballet, anticipated in Brazil as part of Pestival, 2013.
Development of intercontinental ant telecommunication devices.
An interdisciplinary group of 22 scientists, combining paleontological evidence with ecological modeling, has concluded that the earth appears headed toward catastrophic and irreversible environmental changes.
Their report, in the June 7 issue of the journal Nature, describes an exponentially increasing rate of species extinctions, extreme climate fluctuations, and other threats that together risk a level of upheaval not seen since the large-scale extinctions 65 million years ago that killed off the dinosaurs.
The lead author of the report is Anthony D. Barnosky, a professor of integrative biology at the University of California at Berkeley, which coordinated the work in an 18-month project that inaugurated the university's Berkeley Initiative in Global Change Biology.
The report's conclusions center on a measure of the amount of the earth's land surface that has been transformed by people, from forests and prairies to uses such as cornfields and parking lots (...)
text by Paul Basken | image: Blue Horizon by Naccarato
The point is to show how advances in imaging and data visualization technologies enable inter-disciplinary research which just a decade ago would have been impossible to conduct. There is also a somewhat artistic quality to these images, which reinforces the notion of data visualization being both art and science.
The University of Hawaii at Mānoa has developed an optically controlled microrobot system. The microrobots consist of very tiny (0.1 to 0.5 mm diameter) air bubbles inside a fluid-filled chamber. Light is used to heat the surface of the chamber, which generates a force that moves the microrobots around. The microrobots can be used to move objects that are less than a millimeter in size, which can be useful for building structures made of living cells. For more information, see our lab website at http://www-ee.eng.hawaii.edu/~aohta/research.html
Academics & researchers are harnessing our human love of games to crowdsource vast and complex tasks. And when the capacity of the human brain is combined with digital computing power, the possibilities appear truly staggering.
Legendary Australian performance artist Stelarc is known for going to extremes, from aggressive voluntary surgeries and robotic third arms to flesh-hook suspensions and prosthetics. For more than four decades, he has used his body as a canvas for art on the very edge of human experience: He once ingested a “stomach sculpture” that could have killed him.
The leafy streets of Berkeley seem like the last place you’d find Stelarc, who comes off like something out of a sci-fi novel. But here he was on a recent sunny day, in town to give talks at UC Berkeley and in San Francisco (with his old friend Mark Pauline of Survival Research Labs). Stelarc walked down Telegraph Avenue wearing all black, looking traumatized by the vestiges of 1960s hippie culture surrounding him.
“I’ve never really been interested in sci-fi speculation,” he said in an interview with Wired in a dimly lit Berkeley cafe. Artists, he said, can be “early alert warning systems,” generators of “contestable futures — possibilities that can be examined, evaluated, perhaps appropriated, often discarded.”
A series of films about how humans have been colonized by the machines they have built. Although we don’t realize it, the way we see everything in the world today is through the eyes of computers. The series claims that computers have failed to liberate us and instead have distorted and simplified our view of the world around us.