Systems Theory
Posts tagged "singularity"
theoretical aspects of (social) systems theory
Curated by Ben van Lier
Rescooped by Ben van Lier from Tracking the Future

Can we build an artificial superintelligence that won't kill us?

At some point in our future, an artificial intelligence will emerge that's smarter, faster, and vastly more powerful than us. Once this happens, we'll no longer be in charge. But what will happen to humanity? And how can we prepare for this transition? 


Via Szabolcs Kósa
Rescooped by Ben van Lier from Tracking the Future

The Consequences of Machine Intelligence

The question of what happens when machines become as intelligent as, or even more intelligent than, people seems to occupy many science-fiction writers. The Terminator movie trilogy, for example, featured Skynet, a self-aware artificial intelligence that served as the trilogy's main villain, battling humanity through its Terminator cyborgs. Among technologists, it is mostly "Singularitarians" who think about the day when machines will surpass humans in intelligence. The term "singularity" as a description for a phenomenon of technological acceleration leading to a "machine-intelligence explosion" was coined by the mathematician Stanislaw Ulam in 1958, when he wrote of a conversation with John von Neumann concerning the "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."


Via Szabolcs Kósa
Rescooped by Ben van Lier from Tracking the Future

Melanie Mitchell: AI and the Barrier of Meaning

Melanie Mitchell, Professor of Computer Science at Portland State University, discusses her work on the development of Copycat, an AI program that makes analogies in a letter-string domain: given, for example, that "abc" changes to "abd", Copycat infers that "ijk" should change to "ijl".


Via Szabolcs Kósa