Brian Christian is the author of The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive. His writing has appeared in The Atlantic, Wired, The Wall Street Journal, Gizmodo, and The Guardian.
Interxion today unveiled "sleeping pods" at its London data center campus, allowing staff to sleep amongst the racks to ensure that the facility will be fully staffed throughout the 2012 Summer Olympic Games.
It is now easy to love ourselves as cyborgs; cyborg love is here, so embrace it. The essential notion of cyborg love is constructive transgression: a spreading out of our sense perception into new domains of feeling. More than stimulating, less than exciting, slightly uncomfortable, the cyborg symbiosis we are moving into is an opportunity to expand our humanness. According to Amber Case, who studies cyborg anthropology:
“It’s not that machines are taking over – it’s just that they are helping us be more human”
Mobile computing tools for urban and construction planning have developed dramatically over the past few years. Even by global standards, the progress made at VTT Technical Research Centre of Finland has been remarkable. Augmented reality technology developed by VTT makes it possible to place office and residential buildings in their intended surroundings and to study the overall concept on-site, even at the planning stage, for example on a smartphone display.
A few days ago Bruce Sterling posted an "Essay on the New Aesthetic", summing up his most recent thoughts after a panel at SXSW, similarly titled "The New Aesthetic - Seeing like digital devices". The focal point of this New Aesthetic is well documented, and a gestalt emerges quite quickly on the New Aesthetic tumblr: a juxtaposition of quotes, images, sensations, and videos highlighting myriad examples of what its curators recognize is already happening.
In short, the New Aesthetic touches in some sense on the bleeding of the virtual dimension into the actual, and on our increasing reflection of our own methods of sensing in machines, and vice versa. What's surprising, given Bruce's usually sardonic critique of modern culture, is that he acknowledges his own excitement at the novelty of the New Aesthetic: that perhaps it does indeed hold the potential for something new...
Step aside, AT&T and Verizon. A new privacy-protecting Internet service and telephone provider still in the planning stages could become the ACLU's dream and the FBI's worst nightmare. Read this blog post by Declan McCullagh on Privacy Inc.
Computer chips have stopped getting faster. In order to keep increasing chips’ computational power at the rate to which we’ve grown accustomed, chipmakers are instead giving them additional “cores,” or processing units.
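A minimal sketch of what that shift means in practice: since a single core no longer gets faster, software has to split its work across cores to keep scaling. The example below (my own illustration, not from the excerpted article) divides a sum across worker processes using Python's standard `multiprocessing` module.

```python
# Illustrative only: with per-core clock speeds flat, extra throughput
# comes from spreading work across multiple cores, not from a faster core.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """Sum of squares over a half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def sum_of_squares(n, workers=None):
    """Split [0, n) into one chunk per worker and sum the partial results."""
    workers = workers or cpu_count()
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(sum_of_squares(1_000_000, workers=4))
```

The catch, of course, is that this kind of speedup only arrives when the problem decomposes cleanly; code written for one fast core does not get faster on four slow ones for free.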
A deeply fascinating and, by some measures, terrifying milestone on the path to truly ubiquitous networked computation... The BBC notes that security firm McAfee was able to remotely compromise a wireless, implantable insulin pump, propelling the conversation about medical implants into the realm of cyberwarfare. Another McAfee researcher claims to have "captured the signal" of an implanted heart defibrillator and thrown it right back at the device, mercilessly shutting it off. As a class, such devices are increasingly implanted into us fragile apes to contain the threats of heart disease, diabetes, and other slow-moving but potentially fatal conditions that might thwart our god-like ascent into techno-superiority. But grok this, ye mighty, and despair:
My first three titles could probably come under the ‘sub-genre’ of R&D scifi … along with others. Why? Because R&D plays such a pivotal role in the narratives.
What is R&D scifi/SF? Google doesn’t come up with much to help us out on this one, although the answer is probably a bit obvious. Anyway…
1) R&D scifi is any scifi that has R&D as a central or pivotal focus throughout the text or at least at crucial points, which then has a significant impact on the outcome of the narrative. Something like this…
The discovery, using state-of-the-art informatics tools, increases the likelihood that it will be possible to predict much of the fundamental structure and function of the brain without having to measure every aspect of it.
Computer scientist Hava Siegelmann of the Biologically Inspired Neural & Dynamical Systems (BINDS) Laboratory at the University of Massachusetts Amherst, an expert in neural networks, has taken Alan Turing's work to its next logical step. She is translating her 1993 discovery of what she has dubbed "Super-Turing" computation into an adaptable computational system that learns and evolves, using input from the environment in a way much more like our brains do than classic Turing-type computers do.
“This model is inspired by the brain,” she says. “It is a mathematical formulation of the brain’s neural networks with their adaptive abilities.” The authors show that when the model is placed in an environment offering constant sensory stimuli, as in the real world, and when all stimulus-response pairs are considered over the machine’s lifetime, the Super-Turing model yields an exponentially greater repertoire of behaviors than the classical computer or Turing model. They demonstrate that the Super-Turing model is superior for human-like tasks and learning.
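To make the contrast concrete, here is a deliberately tiny sketch (my own illustration, not Siegelmann's actual Super-Turing model): a single neuron whose weights keep changing in response to a stream of stimulus-response pairs from its environment, rather than executing a fixed, pre-written program.

```python
# Illustrative sketch: an online-learning neuron that adapts its weights
# from a stream of (stimulus, response) pairs over its "lifetime".
# This is a classic perceptron update rule, used here only to illustrate
# the idea of environment-driven adaptation, not Siegelmann's model itself.

def train_online(stream, lr=0.1):
    """Update weights after every stimulus-response pair seen."""
    w, b = [0.0, 0.0], 0.0
    for x, target in stream:
        y = 1.0 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0.0
        err = target - y          # mismatch with the environment's response
        w[0] += lr * err * x[0]   # weights drift toward the observed pairs
        w[1] += lr * err * x[1]
        b += lr * err
    return w, b

def predict(w, b, x):
    return 1.0 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0.0

# Environment streams a simple OR-like rule; the neuron absorbs it.
pairs = [((0, 0), 0.0), ((0, 1), 1.0), ((1, 0), 1.0), ((1, 1), 1.0)]
w, b = train_online(pairs * 20)
```

The point of the toy is only the shape of the loop: the machine's parameters are a moving function of its input history, which is the adaptive quality the quoted passage attributes to brain-like computation.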