The promise that each generation will be better off than the last is a fundamental tenet of modern society. By and large, most advanced economies have fulfilled this promise, with living standards rising over recent generations, despite setbacks from wars and financial crises. In the developing world, too, the vast majority of people have started to experience sustained improvement in living standards and are rapidly developing similar growth expectations. But will future generations, particularly in advanced economies, realize such expectations? Though the likely answer is yes, the downside risks seem higher than they did a few decades ago.
At some point in our future, an artificial intelligence will emerge that's smarter, faster, and vastly more powerful than us. Once this happens, we'll no longer be in charge. But what will happen to humanity? And how can we prepare for this transition?
In a nutshell, the Technological Singularity is a term used to describe the theoretical moment in time when artificial intelligence matches and then exceeds human intelligence. The term was popularized by sci-fi writer Vernor Vinge, but full credit goes to the mathematician John von Neumann, who spoke of "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."
By "not continue" von Neumann was referring to the potential for humanity to lose control and fall outside the context of its technologies. Today, this technology is assumed to be artificial intelligence, or more accurately, recursively-improving artificial intelligence (RIAI), leading to artificial superintelligence (ASI).
Because we cannot predict the nature and intentions of an artificial superintelligence, we have come to refer to this sociological event horizon as the Technological Singularity — a concept that's open to wide interpretation and, by consequence, gross misunderstanding.
The present is surreal, and one thing is for sure: the future will be weirder still. Anders Sandberg of the Future of Humanity Institute in Oxford and other thinkers mull over whether humans might become the pets of intelligent machines, what threat epidemic disease poses, and why we feel such a need to predict the future anyway.
Director Biography: Ryan Harding is a photographer and filmmaker based in London.
Director: Ryan Harding. Producer: Marianna Petrilli.
J. Craig Venter has been a molecular-biology pioneer for two decades. After developing expressed sequence tags in the 1990s, he led the private effort to map the human genome, publishing the results in 2001. In 2010 the J. Craig Venter Institute manufactured the entire genome of a bacterium, creating the first synthetic organism. Now Venter, author of Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, explains the coming era of discovery.
James D. Miller, Associate Professor of Economics at Smith College and author of Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World, discusses the economics of the singularity, or the point in time at which we'll either have computers smarter than people or will have significantly increased human intelligence. According to Miller, brains are essentially organic computers, and thus applying Moore's law suggests that we are moving towards the singularity. Since economic output is a product of the human brain, increased brainpower or the existence of computers smarter than humans could produce outputs we cannot even imagine. Another excellent interview by Adam Ford.
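The exponential intuition behind Miller's argument can be made concrete with a little arithmetic. As a minimal sketch (the ~2-year doubling period and the 10/20/30-year horizons below are illustrative assumptions, not figures from the interview), Moore's-law-style growth compounds as 2^(years / doubling_period):

```python
def growth_factor(years, doubling_period=2.0):
    """Multiplicative growth after `years`, assuming capability
    doubles every `doubling_period` years (an assumption for
    illustration, not a claim from the interview)."""
    return 2 ** (years / doubling_period)

if __name__ == "__main__":
    # With a 2-year doubling period: 10 years -> 32x, 20 -> 1,024x, 30 -> 32,768x.
    for horizon in (10, 20, 30):
        print(f"{horizon} years -> x{growth_factor(horizon):,.0f}")
```

The point is less the specific numbers than the shape of the curve: under any steady doubling period, a few decades of compounding yields growth factors that dwarf linear intuition.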
Climate engineering, which could slow the pace of global warming by injecting reflective particles into the upper atmosphere, has emerged in recent years as an extremely controversial technology. A leading scientist long concerned about climate change offers a proposal for an easy fix to what is perhaps the most challenging question of our time. After decades during which very little progress has been made in reducing carbon emissions, we must put this technology on the table and consider it responsibly.
David Keith is the Gordon McKay Professor of Applied Physics in the School of Engineering and Applied Sciences (SEAS) at Harvard University and Professor of Public Policy at the Harvard Kennedy School.
Once that great apocalyptic event—contagion, climate change, nuclear holocaust, zombies, whatever—wipes out the huddled masses of humanity, we can take solace in at least one thing: those who remain will have no shortage of suggestions from art and pop culture as to how best to carry on.
If it's a zombie scenario, they could, for instance, go Walking Dead and form a scrappy band and shack up in a prison. If it's disease, they could hack their bodies, adding Matt-Damon-in-Elysium-style cyborg arm implants to do combat with the rich. If it's rising sea levels, they could follow one Tokyo design firm's advice, and outfit themselves with artificial organs designed to make the human body more water-efficient.
As biology emerges as another generalised computing medium, future biological creations will, just like electronic computing, extend their reach into every aspect of our lives and into every industry, transforming them both to their very core. If our experience with cyberspace is any indication, these developments will unfold unpredictably, yet there are important lessons to be learned. The internet was built for redundancy, not security. As a result, we have the omnipresent spectre of cybercrime looming over us. Before we enter the age of programmable biology, we must contemplate what we might do differently to avoid the mistakes we made in our development of silicon-based computing. DNA is the common thread that runs through all living things. Without it, there is no life. As such, we have no alternative to seriously considering how we will protect the world's original operating system.
For centuries, the threat and selective use of brute force has steered the international balance of power. In the last couple decades, the system has increasingly accommodated economic power as a means of non-violent leverage between states. Now, says Singularity University’s Marc Goodman, we must add technology into the mix. Technological power is not new, of course, but information technology’s exponential pace and declining cost is changing how the global game is played and who the players are. Control of technology is passing from the richest states and governments to smaller groups and individuals, and the results are both inspiring and terrifying. As Goodman says, “The ability of one to affect many is scaling exponentially—and it’s scaling for good and it’s scaling for evil.”
On the eve of a technological breakthrough, an insignificant janitor and a prominent engineer are faced with a decision that will alter the course of humanity: the release of the first aware computer system into the world.
The Long-Term Future of AI (and what we can do about it): Daniel Dewey at TEDxVienna. Daniel Dewey is a research fellow in the Oxford Martin Programme on the Impacts of Future Technology at the Future of Humanity Institute, University of Oxford. His research includes paths and timelines to machine superintelligence, the possibility of intelligence explosion, and the strategic and technical challenges arising from these possibilities.
Apocalyptic weapons are currently the domain of world powers. But this is set to change. Within a few decades, small groups — and even single individuals — will be able to get their hands on any number of extinction-inducing technologies. As shocking as it sounds, the world could be destroyed by a small team or a person acting alone. Here's how.
Imagine a city in space, a round structure miles across that millions of people would call home. Engineers working at NASA in the 1970s developed colorful proposals for permanent settlements in space, but their plans were shelved and forgotten. Decades later, a new generation of dreamers from high schools around the world aspire to mine asteroids, terraform other planets, and venture to the stars. The students have come together for a contest at NASA, and have big plans for the next millennium.
All the key barriers to the artificial synthesis of viruses and bacteria have been overcome, spawning a dizzying array of perils and promises. But as the scientific community forges ahead, the biosecurity establishment remains behind the curve.
For 20 years James Barrat has created documentary films for National Geographic, the BBC, Discovery Channel, History Channel and public television. In 2000, during the course of his career as a film-maker, James interviewed Ray Kurzweil and Arthur C. Clarke. The latter interview not only entirely transformed Barrat's views on artificial intelligence but also led him to write a book on the technological singularity called Our Final Invention: Artificial Intelligence and the End of the Human Era. I read an advance copy of Our Final Invention, and it is by far the most thoroughly researched and comprehensive anti-The Singularity is Near book that I have read so far. And so I couldn't help but invite James on Singularity 1 on 1 so that we could discuss the reasons for his abrupt change of mind and his consequent fear of the singularity. During our 70-minute conversation, Barrat and I cover a variety of interesting topics, such as: his work as a documentary film-maker who takes interesting and complicated subjects and makes them simple to understand; why writing was his first love and how he got interested in the technological singularity; how his initial optimism about AI turned into pessimism; the thesis of Our Final Invention; why he sees artificial intelligence more like ballistic missiles than video games; why true intelligence is an inherently unpredictable "black box"; how we can study AI before we can actually create it; hard vs. slow take-off scenarios; the positive bias in the singularity community; and our current chances of survival and what we should do...
There has been much speculation about the future of humanity in the face of super-humanly intelligent machines. Most of the dystopian scenarios seem to be driven by plain fear that entities arise that could be smarter and stronger than us. After all, how are we supposed to know which goals the machines will be driven by? Is it possible to have “friendly” AI? If we attempt to turn them off, will they care? Would they care about their own survival in the first place?
Vehicles on the road today are already joining our larger "Internet of things." They sync up with our phones through Bluetooth; they register on GPS satellites for navigation; and their mechanical difficulties can be diagnosed at a distance with services like OnStar. But future technology that allows cars to talk to each other directly promises to be much more disruptive still. And that technology has gotten a new push in the last week—at the same time that its potential security weaknesses have been highlighted.
Synthetic biology moves us from reading to writing DNA, allowing us to design biological systems from scratch for any number of applications. Its capabilities are becoming clearer, its first products and processes emerging. Synthetic biology’s reach already extends from reducing our dependence on oil to transforming how we develop medicines and food crops. It is being heralded as the next big thing; whether it fulfils that expectation remains to be seen. It will require collaboration and multi-disciplinary approaches to development, application and regulation. Interesting times ahead!