Paperhouses is an open source architecture platform that's bringing high-quality housing designs to the masses.
Hiring a world-class architect is something not many people can afford. Like high fashion or expensive art, the closest most of us will get to experiencing a designer-drawn home is by looking through the picture book sitting on our coffee tables.
Architecture has long had an accessibility problem: You want a bespoke house? You’re gonna have to pony up a lot of money. In the process, good design has become a luxury; a snooty, out-of-reach idea that only the rich have access to, which is actually the exact opposite of what good design should be. But what if architecture behaved more like technology? Can you expand the reach of quality design by applying the same principles behind open source code to architecture? Ask Joana Pacheco, and the answer will be a resounding yes.
Rob Nail walks into the room looking like a Silicon Valley Doctor Who as played by David Tennant: tailored suit, 3D-printed trainers and the Californian twist on the sonic screwdriver, Google Glass.
But despite spending most of his days predicting what the future will look like, he doesn't want to become a time lord.
"I feel more like a robot," says the chief executive of the Singularity University (SU).
He thinks that the gap between humans and robots is closing as biology and silicon increasingly collide.
He reels off examples.
Bionic eyes that combine a Google Glass device with a tiny electrode in the retina will be available in the US for partially sighted people in a few weeks' time. It is only a matter of time before they filter down to the wider public. "Useful for pilots," he says.
He describes apps for the next-generation Google Glass that will allow users to read the heat maps of people's faces to tell if someone is lying or not. "They will either be banned or become a must-have in the world's boardrooms."
And the first re-engineered human is not far off, either. "It will come within the next year, probably initially to offset some disease," he predicts.
"If you want to be at the head of the class in future you are going to have to be enhanced," he says matter-of-factly.
MIT Technology Review identifies the 10 most important technology milestones of the past year.
Think of the most frustrating, intractable, or simply annoying problems you can imagine. Now think about what technology is doing to fix them. That’s what we did in coming up with our annual list of 10 Breakthrough Technologies. We’re looking for technologies that we believe will expand the scope of human possibilities.
Our definition of a breakthrough is simple: an advance that gives people powerful new ways to use technology. It could be an intuitive design that provides a useful interface (see “Smart Watches”) or experimental devices that could allow people who have suffered brain damage to once again form memories (“Memory Implants”). Some could be key to sustainable economic growth (“Additive Manufacturing” and “Supergrids”), while others could change how we communicate (“Temporary Social Media”) or think about the unborn (“Prenatal DNA Sequencing”). Some are brilliant feats of engineering (“Baxter”). Others stem from attempts to rethink longstanding problems in their fields (“Deep Learning” and “Ultra-Efficient Solar Power”). As a whole, we intend this annual list not only to tell you which technologies you need to know about, but also to celebrate the creativity that produced them.
If this is the case, then it is clear that the 750-billion-ton level will be breached: we are only another 250 billion tons short of it and adding about 40 billion tons each year. Humanity would then need to use technology to extract CO2 rather than rely on natural environmental mechanisms to deal with it.
Even if carbon dioxide emissions came to a sudden halt, the carbon dioxide already in Earth's atmosphere could continue to warm our planet for hundreds of years, according to Princeton University-led research published in the journal Nature Climate Change. The study suggests that it might take a lot less carbon than previously thought to reach the global temperature scientists deem unsafe.
The Intergovernmental Panel on Climate Change estimates that global temperatures a mere 2 degrees Celsius (3.6 degrees Fahrenheit) higher than pre-industrial levels would dangerously interfere with the climate system. To avoid that point would mean humans have to keep cumulative carbon dioxide emissions below 1,000 billion tons of carbon, about half of which has already been put into the atmosphere since the dawn of industry.
The lingering warming effect the researchers found, however, suggests that the 2-degree point may be reached with much less carbon, said first author Thomas Frölicher, who conducted the work as a postdoctoral researcher in Princeton's Program in Atmospheric and Oceanic Sciences under co-author Jorge Sarmiento, the George J. Magee Professor of Geoscience and Geological Engineering.
"If our results are correct, the total carbon emissions required to stay below 2 degrees of warming would have to be three-quarters of previous estimates, only 750 billion tons instead of 1,000 billion tons of carbon," said Frölicher, now a researcher at the Swiss Federal Institute of Technology in Zurich. "Thus, limiting the warming to 2 degrees would require keeping future cumulative carbon emissions below 250 billion tons, only half of the already emitted amount of 500 billion tons."
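The arithmetic behind the revised budget is straightforward; a minimal sketch using only the figures quoted in the article (billion tons of carbon throughout):

```python
# Carbon-budget arithmetic from the study, as quoted above.
total_budget_old = 1000  # billion tons of carbon (earlier IPCC-based estimate)
already_emitted = 500    # billion tons emitted since the dawn of industry

# Revised total budget: three-quarters of the previous estimate.
total_budget_new = total_budget_old * 3 / 4
remaining = total_budget_new - already_emitted

print(total_budget_new)  # 750.0
print(remaining)         # 250.0, half of what has already been emitted
```

The "half of the already emitted amount" in Frölicher's quote falls out directly: 750 minus the 500 already released leaves 250, which is 500 / 2.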
The researchers' work contradicts a scientific consensus that the global temperature would remain constant or decline if emissions were suddenly cut to zero. But previous research did not account for a gradual reduction in the oceans' ability to absorb heat from the atmosphere, particularly in the polar oceans, Frölicher said. Although carbon dioxide steadily dissipates, Frölicher and his co-authors found that the oceans gradually take up less heat from the atmosphere. Eventually, that residual heat offsets the cooling caused by dwindling levels of carbon dioxide.
When cloud formations take physical shape, neither their scale nor duration has an upper bound: We may begin to see cloud towns, then cloud cities, and ultimately cloud countries. At first this sounds rather implausible.
The concept of migrating our lives to the cloud is much more than a picturesque metaphor, and actually amenable to quantitative study. Though the separation between our bodies is still best characterized by the geographical distance between points on the surface of the earth, the distance between our minds is increasingly characterized by a completely different metric: the geodesic distance, the number of degrees of separation between two nodes in a social network. Importantly, this geodesic distance is just as valid a mathematical metric as the geographical. In fact, there are entire conferences devoted to cloud cartography, in which research groups from Stanford to Carnegie Mellon to MIT present the first maps of online social networks — mapping not nation states but states of mind.
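Geodesic distance in this sense is simply shortest-path length in the network graph, computable with a breadth-first search; a minimal sketch (the tiny friendship graph is invented for illustration):

```python
from collections import deque

def geodesic_distance(graph, start, goal):
    """Degrees of separation: shortest-path length via breadth-first search."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor == goal:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None  # the two nodes are not connected

# A toy social network, invented for illustration.
friends = {
    "ana": ["bo", "cy"],
    "bo": ["ana", "dee"],
    "cy": ["ana", "dee"],
    "dee": ["bo", "cy", "eli"],
    "eli": ["dee"],
}
print(geodesic_distance(friends, "ana", "eli"))  # 3
```

Like geographical distance, this count is a genuine metric: it is symmetric (ana is as far from eli as eli is from ana) and obeys the triangle inequality.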
In 1993, after five years of grad school and low-wage postdoctoral research, Michael Kremer got a job as a professor of economics at MIT. With his new salary, he finally had enough money to fund a long-held desire: to return to Kenya’s Western Province, where he had lived for a year after college, teaching in a rural farming community. He wanted to see the place again, reconnect with his host family and other friends he’d made there.
When he arrived the next summer, he found out that one of those friends had begun working for an education nonprofit called ICS Africa. At the time, there was a campaign, spearheaded by the World Bank, to provide free textbooks throughout sub-Saharan Africa, on the assumption that this would boost test scores and keep children in school longer. ICS had tasked Kremer’s friend with identifying target schools for such a giveaway.
While chatting with his friend about this, Kremer began to wonder: How did ICS know the campaign would work? It made sense in theory—free textbooks should mean more kids read them, so more kids learn from them—but they had no evidence to back that up. On the spot, Kremer suggested a rigorous way to evaluate the program: Identify twice the number of qualifying schools as it had the money to support. Then randomly pick half of those schools to receive the textbooks, while the rest got none. By comparing outcomes between the two cohorts, they could gauge whether the textbooks were making a difference.
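Kremer's suggestion is the essence of a randomized controlled trial; a minimal sketch of the design (school names and test scores are invented for illustration):

```python
import random
import statistics

def run_trial(schools, outcome, seed=0):
    """Randomly assign half the qualifying schools to receive textbooks,
    the rest to a control group, then compare average outcomes."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    pool = list(schools)
    rng.shuffle(pool)
    half = len(pool) // 2
    treatment, control = pool[:half], pool[half:]
    effect = (statistics.mean(outcome[s] for s in treatment)
              - statistics.mean(outcome[s] for s in control))
    return treatment, control, effect

# Invented post-program test scores, one per qualifying school.
scores = {f"school_{i}": 60 + i for i in range(10)}
treated, untreated, diff = run_trial(scores.keys(), scores)
print(len(treated), len(untreated))  # 5 5
```

Because assignment is random, any systematic difference in average outcomes between the two cohorts can be attributed to the textbooks rather than to pre-existing differences between schools.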
Imagine living on a bustling city block, but free from the noise of car horns and people on the street. The emerging field of phononics could one day make this a reality.
The phonon, like the photon or electron, is a physical particle that travels as a wave; it represents mechanical vibration. Phonons transmit everyday sound and heat. Recent progress in phononics has led to the development of new ideas and devices that use phononic properties to control sound and heat, according to a new review in Nature.
Two general misconceptions about overpopulation have achieved broad circulation. The first is that it is, as authors from Stephen Hawking to Dan Brown have written, exponential. It is not. This was a central assumption of Thomas Malthus’ “An Essay on the Principle of Population,” in which he argued that an asymmetry between population growth and food production would lead to global famine. In fact, global birthrates are in a long decline. In 1970 the average woman had nearly six children; today that number has dropped below three.

The second common misconception is that rapid population growth is distributed more or less evenly across the globe. In reality, more than half of the world’s citizens now reproduce at below the replacement rate. In many countries, including the United States and India, the birthrate only just exceeds the replacement rate. Population growth remains truly high only in the developing world, mostly in Africa, the Middle East, and Latin America. Projections indicate that these regions will hold the vast majority of the world’s additional people as the century progresses. What they have in common, aside from high birthrates, is poverty.
Evolution does not operate with a goal in mind; it does not have foresight. But organisms that have a greater capacity to evolve may fare better in rapidly changing environments. This raises the question: does evolution favor characteristics that increase a species' ability to evolve?
For several years, biologists have attempted to provide evidence that natural selection has acted on evolvability. Now a new paper by University of Pennsylvania researchers offers, for the first time, clear evidence that the answer is yes.
The senior author on the study, published in the journal PLOS Pathogens, is Dustin Brisson, an assistant professor in the School of Arts and Sciences' Department of Biology. His coauthors include Penn's Christopher J. Graves, Vera I. D. Ros and Paul D. Sniegowski, and the University of Kentucky's Brian Stevenson.
"It's not controversial that populations evolve and that some traits are more apt to evolve than others," Brisson said. "What we were asking is whether the ability of an organism to evolve is a trait that natural selection can pick."
The simplest welfare program imaginable: an income for everyone, no strings attached.
This fall, a truck dumped eight million coins outside the Parliament building in Bern, one for every Swiss citizen. It was a publicity stunt for advocates of an audacious social policy that just might become reality in the tiny, rich country. Along with the coins, activists delivered 125,000 signatures — enough to trigger a Swiss public referendum, this time on providing a monthly income to every citizen, no strings attached. Every month, every Swiss person would receive a check from the government, no matter how rich or poor, how hardworking or lazy, how old or young. Poverty would disappear. Economists, needless to say, are sharply divided on what would reappear in its place — and whether such a basic-income scheme might have some appeal for other, less socialist countries too.
Bill Gates on how innovation is the key to a brighter future, and how we're only just getting started.
Technology is unlocking the innate compassion we have for our fellow human beings. In the end, that combination—the advances of science together with our emerging global conscience—may be the most powerful tool we have for improving the world.
Google's Motorola Mobility division has filed an application with the US Patent and Trademark Office for a "system and method" to tattoo a mobile-device microphone, complete with lie-detector circuitry, onto your throat.
Your immediate response, dear reader, was ours as well: What the...? With such a device tattooed on one's throat, it would make it rather painful to, say, switch carriers, eh? And with a lie detector permanently attached, NSA snoops could have a field day.
It wasn't until we had read at least halfway through the 10 pages of US Patent Application No. 20130297301, "Coupling an Electronic Skin Tattoo to a Mobile Device" that we encountered the words "flexible substrate".
After our fears were so allayed, we were able to more objectively evaluate the application, filed on May 3, 2012, and published this Thursday during the USPTO's weekly patent-fest. In sum, some of the filing's ideas seem reasonable, and others risible.
We'll begin with the reasonable, then move on from there.
Silicon Valley hosts lavish ceremony for Breakthrough prize that aims to give scientists celebrity status and inspire interest in life's 'big questions'
Silicon Valley has a tendency to tackle social ills with big ideas, its feisty startups revolutionising everything from healthcare to education. Now a handful of billionaire engineers have turned their attention to a social blight that affects their own kind: the lack of appreciation (and funding) for scientists.
The second Breakthrough prize for life sciences is being awarded on Thursday at Nasa's Ames Research Centre in Mountain View, California, about a five-minute drive from the headquarters of Google. It will probably be the most lavish awards ceremony that its six winning scientists have ever been to.
Actor Kevin Spacey is expected to host, while comedian Conan O'Brien and actor Glenn Close will make presentations. Organisers hope for the whole event to be televised, and the six prizes will be worth $3m (£1.83m) – each. This is among the most lucrative awards in science, almost triple the size of the Nobel prize, and bigger than the $1.7m Templeton prize. It's expected to be bigger and bolder than the last similar ceremony, held on 20 March 2013 in Geneva, Switzerland, where Morgan Freeman hosted and Sarah Brightman sang.
It's like the "Oscars of science", says Yuri Milner, the wealthy technology investor behind it all. Once upon a time Milner was himself a physics major at Moscow State University, but he ended up dropping out and embracing the world of tech investing instead. In 2009 he steered one of the first big funding rounds in Facebook, before putting money behind Groupon, Zynga, Twitter and Airbnb and picking up a host of new contacts and street cred in Silicon Valley. In 2011 Forbes magazine valued him as a billionaire.
The selfish gene is one of the most successful science metaphors ever invented. Unfortunately it’s wrong
A couple of years ago, at a massive conference of neuroscientists — 35,000 attendees, scores of sessions going at any given time — I wandered into a talk that I thought would be about consciousness but proved (wrong room) to be about grasshoppers and locusts. At the front of the room, a bug-obsessed neuroscientist named Steve Rogers was describing these two creatures — one elegant, modest, and well-mannered, the other a soccer hooligan.
The grasshopper, he noted, sports long legs and wings, walks low and slow, and dines discreetly in solitude. The locust scurries hurriedly and hoggishly on short, crooked legs and joins hungrily with others to form swarms that darken the sky and descend to chew the farmer’s fields bare.
Related, yes, just as grasshoppers and crickets are. But even someone as insect-ignorant as I could see that the hopper and the locust were wildly different animals — different species, doubtless, possibly different genera. So I was quite amazed when Rogers told us that grasshopper and locust are in fact the same species, even the same animal, and that, as Jekyll is Hyde, one can morph into the other at alarmingly short notice.
(Phys.org) —Given all the weird things that can occur in quantum mechanics—from entanglement to superposition to teleportation—not much seems surprising in the quantum world. Nevertheless, a new finding that an object's physical properties can be disembodied from the object itself is not something we're used to seeing on an everyday basis. In a new paper, physicists have theoretically shown that this phenomenon, which they call a quantum Cheshire Cat, is an inherent feature of quantum mechanics and could prove useful for performing precise quantum measurements by removing unwanted properties.
The physicists, Yakir Aharonov at Tel Aviv University in Tel Aviv, Israel, and Chapman University in Orange, California, US, and his coauthors have published a paper on quantum Cheshire Cats in a recent issue of the New Journal of Physics.
The physicists begin their paper with an excerpt from Lewis Carroll's 1865 novel Alice in Wonderland:
'All right', said the Cat; and this time it vanished quite slowly, beginning with the end of the tail, and ending with the grin, which remained some time after the rest of it had gone.
'Well! I've often seen a cat without a grin', thought Alice, 'but a grin without a cat! It's the most curious thing I ever saw in my life!'
Just as the grin is a property of a cat, polarization is a property of a photon. In their paper, the physicists explain how, "in the curious way of quantum mechanics, photon polarization may exist where there is no photon at all."
Bitcoin is a digital currency, meaning it's money controlled and stored entirely by computers spread across the internet, and this money is finding its way to more and more people and businesses around the world.
The price of a bitcoin topped $900 last week, an enormous surge in value that arrived amidst Congressional hearings where top U.S. financial regulators took a surprisingly rosy view of digital currency. Just 10 months ago, a bitcoin sold for a measly $13.
The spike was big news across the globe, from Washington to Tokyo to China, and it left many asking themselves: “What the hell is a bitcoin?” It’s a good question — not only for those with little understanding of the modern financial system and how it intersects with modern technology, but also for those steeped in the new internet-driven economy that has so quickly remade our world over the last 20 years.
Bitcoin is a digital currency, meaning it’s money controlled and stored entirely by computers spread across the internet, and this money is finding its way to more and more people and businesses around the world. But it’s much more than that, and many people — including the sharpest of internet pioneers as well as seasoned economists — are still struggling to come to terms with its many identities.
With that in mind, we give you this: an idiot’s guide to bitcoin. And there’s no shame in reading. Nowadays, as bitcoin is just beginning to show what it’s capable of, we’re all neophytes.
Bitcoin isn’t just a currency, like dollars or euros or yen. It’s a way of making payments, like PayPal or the Visa credit card network. It lets you hold money, but it also lets you spend it and trade it and move it from place to place, almost as cheaply and easily as you’d send an email.
And why their "bad" decisions might be more rational than you'd think.
In August, Science published a landmark study concluding that poverty itself hurts our ability to make decisions about school, finances, and life, imposing a mental burden similar to losing 13 IQ points.
It was widely seen as a counter-argument to claims that poor people are "to blame" for bad decisions and a rebuke to policies that withhold money from the poorest families unless they behave in a certain way. After all, if being poor leads to bad decision-making (as opposed to the other way around), then giving cash should alleviate the cognitive burdens of poverty, all on its own.
Sometimes, science doesn't stick without a proper anecdote, and "Why I Make Terrible Decisions," a comment published on Gawker's Kinja platform by a person in poverty, is a devastating illustration of the Science study. I've bolded what I found the most moving, insightful portions, but it's a moving and insightful testimony all the way through.
Transhumanism is the idea that we can (and should) use scientific knowledge to fundamentally transform the human form in order to improve human intellectual and physical abilities. These ideas are usually connected with discussion about radical life extension, or even dreams of some form of immortality. I’ve previously discussed how companies like Google are even getting involved in the transhumanist desire for radical life extension. The cultural pressure for longer life seems to be growing quickly. But can we actually live forever (or at least radically longer)? The answer to that question is (sadly) unknown, but I wanted to make a video that contextualized our seemingly innate human drive to seek immortality or eternal youth.
Feeling nostalgic about the past can increase optimism about the future, a new study shows
The research, published in the Personality and Social Psychology Bulletin, examines the idea that nostalgia is not simply a past-orientated emotion and that its scope extends into the future, with a positive outlook.
“Nostalgia is experienced frequently and by virtually everyone, and we know that it can maintain psychological comfort,” says study co-author Tim Wildschut. “For example, nostalgic reverie can combat loneliness. We wanted to take that a step further and assess whether it can increase a feeling of optimism about the future.”
The secret to why some cultures thrive and others disappear may lie in our social networks and our ability to imitate, rather than our individual smarts, according to a new University of British Columbia study.
The study, published in the Proceedings of the Royal Society B: Biological Sciences (open access), shows that when people can observe and learn from a wider range of teachers, groups can better maintain technical skills and even increase the group’s average skill over successive generations.
The findings show that a larger population size and social connectedness are crucial for the development of more sophisticated technologies and cultural knowledge, says lead author Michael Muthukrishna, a PhD student in UBC’s Dept. of Psychology.
“This is the first study to demonstrate in a laboratory setting what archeologists and evolutionary theorists have long suggested: that there is an important link between a society’s sociality and the sophistication of its technology,” says Muthukrishna, who co-authored the research with UBC Prof. Joseph Henrich.
Teens who go to bed late are more prone to academic and emotional difficulties in the long run, compared to teens who turn in early, a new study shows.
Adolescents who go to bed late during the school year are more prone to academic and emotional difficulties in the long run, compared to teens who turn in early.
Researchers analyzed longitudinal data from a nationally representative cohort of 2,700 US adolescents, 30 percent of whom reported bedtimes later than 11:30 p.m. on school days and 1:30 a.m. in the summer during their middle and high school years.
By the time they graduated from high school, the school-year night owls had lower GPAs and were more vulnerable to emotional problems than teens with earlier bedtimes, according to the study published in the Journal of Adolescent Health.
The results present a compelling argument in favor of later middle and high school start times in the face of intense academic, social and technological pressures, researchers say.
You want a world-class conversation about the future of global health, the vanguard of philanthropy, and the divide between ignorance and data-driven knowledge? Bring in the Bills. Gates and Clinton, that is.