Technology has given us a "read later" mentality. We don't seem to want it.
A new solution to the prisoner’s dilemma, a classic game theory scenario, has created new puzzles in evolutionary biology.
When the manuscript crossed his desk, Joshua Plotkin, a theoretical biologist at the University of Pennsylvania, was immediately intrigued. The physicist Freeman Dyson and the computer scientist William Press, both highly accomplished in their fields, had found a new solution to a famous, decades-old game theory scenario called the prisoner’s dilemma, in which players must decide whether to cheat or cooperate with a partner. The prisoner’s dilemma has long been used to help explain how cooperation might endure in nature. After all, natural selection is ruled by the survival of the fittest, so one might expect that selfish strategies benefiting the individual would be most likely to persist. But careful study of the prisoner’s dilemma revealed that organisms could act entirely in their own self-interest and still create a cooperative community.
Press and Dyson’s new solution to the problem, however, threw that rosy perspective into question. It suggested the best strategies were selfish ones that led to extortion, not cooperation.
Plotkin found the duo’s math remarkable in its elegance. But the outcome troubled him. Nature includes numerous examples of cooperative behavior. For example, vampire bats donate some of their blood meal to community members that fail to find prey. Some species of birds and social insects routinely help raise another’s brood. Even bacteria can cooperate, sticking to each other so that some may survive poison. If extortion reigns, what drives these and other acts of selflessness?
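The dilemma's structure can be made concrete with a tiny sketch. The payoff values below (temptation 5, reward 3, punishment 1, sucker's payoff 0) are the conventional textbook numbers, not from the article; the point is that defection is the best response to either move, even though mutual cooperation pays both players more than mutual defection.

```python
# One-shot prisoner's dilemma with the standard payoff ordering T > R > P > S.
# The specific values (5, 3, 1, 0) are illustrative, not from the article.
PAYOFF = {
    ("cooperate", "cooperate"): (3, 3),  # reward for mutual cooperation
    ("cooperate", "defect"):    (0, 5),  # sucker's payoff vs temptation
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # punishment for mutual defection
}

def best_response(opponent_move):
    """Return the move that maximizes our own payoff against a fixed opponent move."""
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFF[(my_move, opponent_move)][0])

# Defection is a best response to either move -- that is the dilemma,
# even though (3, 3) beats (1, 1) for both players.
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
```

Press and Dyson's extortion result concerns the iterated version of this game, where the same two players meet repeatedly; the one-shot matrix above is only the building block.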
A new brain-scanning technique could change the way scientists think about human focus.
Human attention isn’t stable, ever, and it costs us: lives lost when drivers space out, billions of dollars wasted on inefficient work, and mental disorders that hijack focus. Much of the time, people don’t realize they’ve stopped paying attention until it’s too late. This “flight of the mind,” as Virginia Woolf called it, is often beyond conscious control.
So researchers at Princeton set out to build a tool that could show people what their brains are doing in real time, and signal the moments when their minds begin to wander. And they've largely succeeded, a paper published today in the journal Nature Neuroscience reports. The scientists who invented this attention machine, led by professor Nick Turk-Browne, are calling it a “mind booster.” It could, they say, change the way we think about paying attention—and even introduce new ways of treating illnesses like depression.
Here’s how the brain decoder works: You lie down in a functional magnetic resonance imaging (fMRI) machine—similar to the MRI machines used to diagnose diseases—which lets scientists track brain activity. Once you're in the scanner, you watch a series of pictures and press a button when you see certain targets. The task is like a video game—the dullest video game in the world, really, which is the point. You see a face overlaid on an image of a landscape. Your job is to press a button if the face is female, as it is 90 percent of the time, but not if it’s male. And ignore the landscape. (There’s also a reverse task, in which you’re asked to judge whether the scene is outside or inside, and ignore the faces.)
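The go/no-go task itself can be sketched in a few lines. Only the 90 percent category frequency comes from the article; the trial count, the seed, and the scoring below are illustrative assumptions.

```python
import random

# Minimal sketch of the go/no-go attention task described above.
# 90% of trials are "go" (female face) trials, per the article; the
# rest of the setup is invented for illustration.
def run_task(n_trials=20, responder=lambda face: face == "female", seed=0):
    """Run the task and return the fraction of correct responses."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        face = "female" if rng.random() < 0.9 else "male"  # 90% go trials
        pressed = responder(face)
        # Correct = press on a female face ("go"), withhold on male ("no-go").
        if pressed == (face == "female"):
            correct += 1
    return correct / n_trials

# A perfectly attentive responder scores 100%.
assert run_task() == 1.0
```

The no-go trials are what make the task sensitive to lapses: a mind-wandering participant who presses by habit gets the frequent "go" trials right but misses the rare "male" catch trials.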
Extreme weather arising from a climate phenomenon in the Pacific Ocean will get much worse as the world warms, according to climate modelling.
Parts of the world will have weather patterns that switch between extremes of wet and dry, say scientists.
The US will see more droughts while flooding will become more common in the western Pacific, research suggests.
The study, in Nature Climate Change, adds to a growing body of evidence over climate change and extreme weather.
The latest data - based on detailed climate modelling work - suggests extreme La Niña events in the Pacific Ocean will almost double with global warming, from one in 23 years to one in 13 years.
Most will follow extreme El Niño events, meaning frequent swings between opposite extremes from one year to the next.
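A quick back-of-envelope check of the "almost double" claim, using only the return periods quoted above:

```python
# Return periods for extreme La Nina events, from the study.
period_now, period_warmed = 23, 13  # years between events

rate_now = 1 / period_now        # events per year, current climate
rate_warmed = 1 / period_warmed  # events per year, projected under warming

# 23/13 is about 1.77, i.e. "almost double".
increase = rate_warmed / rate_now
assert 1.7 < increase < 1.8
```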
Lead researcher Dr Wenju Cai from the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia, said this would mean an increase in the occurrence of "devastating weather events with profound socio-economic consequences".
"El Niño and La Niña can be a major driver of extreme weather," he said. "We are going to see these extreme weather [events] become more frequent."
El Niño and La Niña are complex weather patterns arising from variations in ocean temperatures in the Equatorial Pacific. They can have large-scale impacts on global weather and climate.
La Niña is sometimes referred to as the cold phase and El Niño as the warm phase of this natural climate phenomenon.
The researchers explored ways to improve “vertically integrated nanogenerator” energy-harvesting chips based on ZnO. They inserted an aluminum-nitride insulating layer into a conventional ZnO energy-harvesting chip and found that the added layer increased the output voltage 140 to 200 times (from 7 millivolts to 1 volt, in one configuration). The increase was the result of aluminum nitride’s high dielectric constant (which strengthens the electric field) and large Young’s modulus (stiffness).
Sir Tim Berners-Lee calls for net access to be treated as a basic right, following a report suggesting great inequalities online.
The web is becoming less free and more unequal, according to a report from the World Wide Web Foundation.
Its annual web index suggests web users are at increasing risk of government surveillance, with laws preventing mass snooping weak or non-existent in over 84% of countries.
It also indicates that online censorship is on the rise.
The report led web inventor Sir Tim Berners-Lee to call for net access to be recognised as a human right.
The World Wide Web Foundation, led by Sir Tim, measured the web's contribution to the social, economic and political progress of 86 countries.
Other headline findings from the report include:
- 74% of countries either lack clear and effective net neutrality rules and/or show evidence of traffic discrimination
- 62% of countries report that the web plays a major role in sparking social or political action
- 74% of countries are not doing enough to stop online harassment of women
The index ranked countries around the world in terms of:
- universal access
- relevant content and use
- freedom and openness
- empowerment
Four of the top five were Scandinavian, with Denmark in first place, Finland second and Norway third. The UK came fourth, followed by Sweden.
"The richer and better educated people are, the more benefit they are gaining from the digital revolution," said Anne Jellema, chief executive of the World Wide Web Foundation, and the lead author of the report.
Limitless movie poster (credit: Virgin Produced)

Is it possible to rapidly increase (or decrease) the amount of information the brain can store?
The study is described in an open-access paper in Cell Reports. Funding was provided by the Canadian Institutes of Health Research (CIHR), the Natural Sciences and Engineering Research Council of Canada, and the U.S. National Institutes of Health.
A paper that claims that smoking causes cancer or that terrorism is caused by poverty is valuable only if it turns out to be a good explanation of cancer or terrorism. As recently noted by Philip Gerrans at the University of Adelaide, “[It] is why an original and true explanation is the gold standard of academic markets.”
Natural selection isn’t nearly enough to explain how life created so many innovations so fast. Fortunately for us, writes SFI External Professor Andreas Wagner in a new book, Nature had something else up her sleeve: robustness.
Even in organisms with relatively few genes, the number of possible combinations of those genes is unimaginably enormous — many, many orders of magnitude greater than the number of hydrogen atoms in the Universe. Even 3.7 billion years isn’t enough to search all those possibilities at random and find all the forms of life we have today.
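A back-of-envelope check of that combinatorial claim, with an assumed sequence length far smaller than any real genome:

```python
# Combinations of 4 bases over even a short stretch of sequence dwarf the
# number of hydrogen atoms in the observable Universe. The length here is
# an illustrative assumption; real genomes run to millions or billions of
# positions.
positions = 300
combinations = 4 ** positions
hydrogen_atoms = 10 ** 80  # common order-of-magnitude estimate

# 4^300 is about 1e180 -- more than the square of the atom count.
assert combinations > hydrogen_atoms ** 2
```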
In Arrival of the Fittest: Solving Evolution's Greatest Puzzle (Current Hardcover, October 2, 2014), Wagner shows how robustness, long a subject of interest at SFI, helped solve the problem. Metabolic systems, protein interactions, and gene regulation networks share a particular kind of robustness: even drastic changes to the underlying structure leave their operations unchanged. For example, the complex of chemical reactions that metabolizes glucose in E. coli can overlap by as little as 20 percent and still function perfectly well.
Read a review of Wagner's book by Mark Pagel in Nature (October 1, 2014)
The Simons Foundation awarded a grant to a team of researchers at the University of California, Santa Cruz to develop a graph-based human reference genome.
In 2003, the Human Genome Project (HGP) successfully mapped a large portion of the human genome. Since that time, the HGP’s genomic map — a linear sequence of the four DNA bases — has served as a single reference genome for all novel sequencing data. But while immensely valuable, the HGP’s reference genome does not account for all genomic variation, making it inadequate for representing humanity as a whole, which encompasses many and complicated genetic variants.
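The idea of a graph reference can be sketched with a toy data structure: the linear reference is one path through the graph, and each variant adds an alternative branch. Everything below (node names, sequences, the single substitution) is invented for illustration; real sequence-graph formats are far richer.

```python
# Toy sequence graph: a hypothetical single-base substitution (A vs G)
# represented as two alternative branches between shared flanking nodes.
graph = {
    "n1":  {"seq": "ACGT", "next": ["n2a", "n2b"]},
    "n2a": {"seq": "A",    "next": ["n3"]},   # reference allele
    "n2b": {"seq": "G",    "next": ["n3"]},   # alternate allele
    "n3":  {"seq": "TTCA", "next": []},
}

def enumerate_haplotypes(graph, node="n1", prefix=""):
    """Yield every sequence spelled out by a path through the graph."""
    prefix += graph[node]["seq"]
    if not graph[node]["next"]:
        yield prefix
        return
    for nxt in graph[node]["next"]:
        yield from enumerate_haplotypes(graph, nxt, prefix)

# Both haplotypes live in one structure; a linear reference could hold only one.
assert sorted(enumerate_haplotypes(graph)) == ["ACGTATTCA", "ACGTGTTCA"]
```

The payoff is that reads from either allele align cleanly to the graph, rather than being forced against whichever allele the linear reference happens to carry.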
“In the decade since the HGP announced the completion of a major portion of their work, the vast improvement in our understanding of the complexity of the genome, the rapid improvement of technology for sequencing genomes and the increasingly broad application of this technology have created a need to rethink how scientists describe to one another the rich patterns of genomic variations uncovered by cutting-edge experiments,” says Nick Carriero, group leader for software development at the Simons Center for Data Analysis. “Given that study of variation is at the heart of most medical and life sciences genome-based research, addressing this challenge is critical to advancing these fields.”
A startup called Clarifai is able to search video images in a matter of seconds and figure out what's inside them. Watch how it works in this demo video.
We’ve glimpsed the future of online search, and here it is: a 17-second video of a puppy brought to you by Clarifai, a tiny startup that specializes in artificial intelligence.
The video (above) shows the puppy looking very cute as it nuzzles with its female owner, but the interesting stuff is happening in the squiggly lines below. Using a database of 10,000 visual categories Clarifai has built over the past six months, the company’s software tracks the images that appear in the video, automatically describing it with words like “dog,” “female,” “eyes,” and even “cute.”
The idea is that you can then search for these words, and the software will tell you when the corresponding images appear.
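That search step amounts to an inverted index from tags to timestamps. A minimal sketch, with invented times (the tags echo the article, but the per-frame assignments are made up):

```python
from collections import defaultdict

# Hypothetical per-frame classifier output: timestamp -> tags.
frame_tags = {
    0.0: ["dog", "cute"],
    1.5: ["dog", "female", "eyes"],
    3.0: ["female"],
}

# Invert it: tag -> list of timestamps where the tag appears.
index = defaultdict(list)
for timestamp, tags in frame_tags.items():
    for tag in tags:
        index[tag].append(timestamp)

def search(tag):
    """Return the sorted timestamps at which a tag was detected."""
    return sorted(index.get(tag, []))

assert search("dog") == [0.0, 1.5]
assert search("cat") == []
```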
It’s part of a trend in artificial intelligence, called deep learning, that’s sweeping through technology giants, giving us software that approaches human levels of perception. Google uses it to boost Android’s voice recognition. Microsoft uses it in a Star Trek-like instant language translator. Facebook is using it to improve its automatic tagging of everyone in your photos. And soon, deep learning will change how we search through videos, making it possible for machines to analyze clips and quickly understand what’s within them.
The original planetary boundaries were conceived in 2009 by a team led by Johan Rockstrom, also of the Stockholm Resilience Centre. Together with his co-authors, Rockstrom produced a list of nine human-driven changes to the Earth’s system: climate change, ocean acidification, stratospheric ozone depletion, alteration of nitrogen and phosphorus cycling, freshwater consumption, land use change, biodiversity loss, atmospheric aerosol loading and chemical pollution. Each of these nine, if driven hard enough, could alter the planet to the point where it becomes a much less hospitable place on which to live.
We call it white matter because the axons are wrapped in a fatty layer, the myelin, which ensures better neuronal communication – the way information is transferred around the brain. The myelin functions as an “insulation” that prevents information “leaking” from the axon during transfer.
The incorporated entity has a fancy name and all, but it’s less a standard company than a group of about 100 engineers all over the country who spend their free time spitballing ideas in exchange for stock options. That said, this isn’t a subreddit trying to solve the Boston Marathon bombing. These gals and guys applied for the right to work on the project (another 100 or so were rejected) and nearly all of them have day jobs at companies like Boeing, NASA, Yahoo!, Airbus, SpaceX, and Salesforce. They’re smart. And they’re organized.
Owing to the extreme conditions on the Venusian surface, it's going to be quite some time before a human ever sets foot on that planet. That's why NASA is developing a plan to deploy human-occupied airships in Venus's upper atmosphere. And yes, permanent occupation is the ultimate goal.
Via Alessio Erioli
Stem-cell technology is being used to grow fresh human blood in the laboratory – but don’t hand in your donor card just yet.
In 2007, a team of researchers from the UK and Irish Blood services responded to an oddly specific call from the US military. They wanted scientists to help them build a machine, no bigger than two and a half washing machines, that could be dropped from a helicopter on to a battlefield and generate stem-cell-derived blood for injured soldiers.
The team’s application was not successful, but they refocused their efforts and set off on a more utopian mission – to develop a similar technology to create a limitless supply of clean, laboratory-grown blood for use in clinics around the world. Using blood made from stem cells would unshackle blood services from the limits of human supply, and any risk of infection would be removed.
They’ve been working with embryonic or induced pluripotent stem cells, which, given the right culture conditions, can differentiate into any type of cell. Still at least a year from human testing, the team have tweaked their protocol to select only red blood cells.
“Because we make them from human cells they are as nature intended,” says Joanne Mountford, of the University of Glasgow, who leads the project along with Marc Turner, the medical director of the Scottish National Blood Transfusion Service.
“It’s the same thing your body makes but we’re just doing it in a lab.”
Kurt Andersen wonders: If the Singularity is near, will it bring about global techno-Nirvana or civilizational ruin?
Artificial intelligence is suddenly everywhere. It’s still what the experts call “soft A.I.,” but it is proliferating like mad. We’re now accustomed to having conversations with computers: to refill a prescription, make a cable-TV-service appointment, cancel an airline reservation—or, when driving, to silently obey the instructions of the voice from the G.P.S.
But until the other morning I’d never initiated an elective conversation with a talking computer. I asked the artificial-intelligence app on my iPhone how old I am. First, Siri spelled my name right, something human beings generally fail to do. Then she said, “This might answer your question,” and displayed my correct age in years, months, and days. She knows more about me than I do. When I asked, “What is the Singularity?,” Siri inquired whether I wanted a Web search (“That’s what I figured,” she replied) and offered up this definition: “A technological singularity is a predicted point in the development of a civilization at which technological progress accelerates beyond the ability of present-day humans to fully comprehend or predict.”
Siri appeared on my phone three years ago, a few months after the IBM supercomputer Watson beat a pair of Jeopardy! champions. Since then, Watson has been speeded up 24-fold and fed millions of pages of medical data, thus turning the celebrity machine into a practicing cancer diagnostician. Autonomous machines now make half the trades on Wall Street, meaning, for instance, that a firm will often own a given stock for less than a second—thus the phrase “high-frequency trading,” the subject of Flash Boys, Michael Lewis’s book earlier this year. (Trading by machines is one reason why a hoax A.P. tweet last year about a White House bombing made the Dow Jones Industrial Average suddenly drop 146 points.) Google’s test fleet of a couple dozen robotic Lexuses and Priuses, after driving more than 700,000 miles on regular streets and highways, has been at fault in not a single accident. Meanwhile, bionic and biological breakthroughs are radically commingling humans and machines. Last year, a team of biomedical engineers demonstrated a system that enabled people wearing electrode-embedded caps to fly a tiny drone helicopter with their minds.
Machines performing unimaginably complicated calculations unimaginably fast—that’s what computers have always done. Computers were called “electronic brains” from the beginning. But the great open question is whether a computer really will be able to do all that your brain can do, and more. Two decades from now, will artificial intelligence—A.I.—go from soft to hard, equaling and then quickly surpassing the human kind? And if the Singularity is near, will it bring about global techno-Nirvana or civilizational ruin?
An important and well-written read.
A Mediterranean diet may be a better way of tackling obesity than calorie counting, leading doctors have said.
Writing in the Postgraduate Medical Journal (PMJ), the doctors said a Mediterranean diet quickly reduced the risk of heart attacks and strokes.
And they said it may be better than low-fat diets for sustained weight loss.
Official NHS advice is to monitor calorie intake to maintain a healthy weight.
Last month NHS leaders stressed the need for urgent action to tackle obesity and the health problems that often go with it.
The PMJ editorial argues a focus on what people eat, rather than on counting calories, is the best approach, but it warns crash dieting is harmful.
Signatories of the piece included the chair of the Academy of Medical Royal Colleges, Prof Terence Stephenson, and Dr Mahiben Maruthappu, who has a senior role at NHS England.
They criticise the weight-loss industry for focusing on calorie restriction rather than "good nutrition".
“Analogue” and “digital” are the two polar opposites of our modern world. The word “analogue” has become our catch-all term for what we see as slow, one-way and limited in functional possibilities, while “digital” is our synonym for the dynamic, interactive and fluid.
Conductive inks such as those produced by the British firm Bare Conductive mean that pen and ink can be used to make circuits – and a piece of paper could feasibly become a circuit board, much like that in a computer but infinitely more flexible and versatile.
Investor Peter Thiel has inspiring advice for wanna-be entrepreneurs, but he is unrealistic about where technology really comes from.
Is the technology investor Peter Thiel brilliant, or is he just strange? He is nothing if not industrious. Since he cofounded PayPal, in 1998, Thiel has had a hand in some of the most important and unexpected tech companies of our era. His success has made him an oracular presence in Silicon Valley.
Thiel’s contrarianism is notorious, and he appears to delight in saying or doing the unexpected, even at the risk of ridicule. Each year, his nonprofit gives a handful of college students $100,000 to drop out of school and pursue a risky startup. He has declared himself to be not only against taxes but against “the ideology of the inevitability of death.” And when the Seasteading Institute—a utopian group intent on building floating cities so as to escape the intrusions of government—sought funding a few years ago, Thiel ponied up half a million dollars.
If one wanted to emulate Peter Thiel’s success, would one have to do more than just the opposite of everyone else? His new book—a polished version of some lectures he gave at Stanford for aspiring entrepreneurs in 2012—suggests that there is such a creed as Thielism. His theories on what makes a good technology company and how such companies can improve society are by turns brazen, thoughtful, and precise; the challenge lies in separating the truth from the truthiness. Thiel insightfully diagnoses the failings of today’s technology (see Q&A), but the cures he suggests are questionable.
According to Thiel, most startups funded by his fellow Silicon Valley investors shouldn’t exist. All prospective entrepreneurs, he suggests, should ask themselves a simple and essential question: “What valuable company is nobody building?” If they don’t have an answer, they should do something else.