We're Underestimating the Risk of Human Extinction
An Oxford philosopher argues that we are not adequately accounting for technology's risks -- but his solution to the problem is not for Luddites.

"Bostrom, who directs Oxford's Future of Humanity Institute, has argued over the course of several papers that human extinction risks are poorly understood and, worse still, severely underestimated by society. Some of these existential risks are fairly well known, especially the natural ones. But others are obscure or even exotic. Most worrying to Bostrom is the subset of existential risks that arise from human technology, a subset that he expects to grow in number and potency over the next century."