Giant academic social networks have taken off to a degree that no one expected even a few years ago. A Nature survey explores why.
Via Neelima Sinha
R Schumacher & Associates LLC's curator insight,
January 15, 2014 1:43 PM
Monikers such as "deep learning" may be new, but artificial intelligence has always been the Holy Grail of computer science. The applications are many, and the path is becoming less of an uphill climb.
luiy's curator insight,
February 26, 2014 6:19 AM
Deep learning itself is a revival of an even older idea for computing: neural networks. These systems, loosely inspired by the densely interconnected neurons of the brain, mimic human learning by changing the strength of simulated neural connections on the basis of experience. Google Brain, with about 1 million simulated neurons and 1 billion simulated connections, was ten times larger than any deep neural network before it. Project founder Andrew Ng, now director of the Artificial Intelligence Laboratory at Stanford University in California, has gone on to make deep-learning systems ten times larger again.
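The core idea described above — adjusting the strength of simulated connections on the basis of experience — can be illustrated with a toy example. The sketch below is not Google Brain's code or anything like it in scale; it is a single simulated neuron (names like `train_neuron` are invented for illustration) that learns the logical AND function by nudging its connection weights to reduce prediction error:

```python
import math

def sigmoid(x):
    """Squash a raw signal into the range (0, 1), like a neuron's firing rate."""
    return 1.0 / (1.0 + math.exp(-x))

def train_neuron(samples, lr=0.5, epochs=2000):
    """Learn weights for one neuron from (inputs, target) pairs.

    Each pass, the neuron's connection strengths (weights) are moved
    slightly in the direction that reduces squared prediction error --
    the 'changing connection strength from experience' in the article.
    """
    w = [0.0, 0.0]  # connection strengths for the two inputs
    b = 0.0         # bias (baseline excitability)
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = y - target             # how wrong the prediction was
            grad = err * y * (1 - y)     # gradient of squared error w.r.t. raw signal
            w[0] -= lr * grad * x1       # weaken/strengthen each connection
            w[1] -= lr * grad * x2
            b    -= lr * grad
    return w, b

# Experience: examples of the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)

def predict(x1, x2):
    return sigmoid(w[0] * x1 + w[1] * x2 + b)
```

After training, `predict(1, 1)` is close to 1 and the other input combinations are close to 0. Google Brain's network applies the same principle, but with roughly a million neurons and a billion connections rather than one neuron and two.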
Such advances make for exciting times in artificial intelligence (AI) — the often-frustrating attempt to get computers to think like humans. In the past few years, companies such as Google, Apple and IBM have been aggressively snapping up start-up companies and researchers with deep-learning expertise. For everyday consumers, the results include software better able to sort through photos, understand spoken commands and translate text from foreign languages. For scientists and industry, deep-learning computers can search for potential drug candidates, map real neural networks in the brain or predict the functions of proteins.
Andreas Pappas's curator insight,
March 28, 2014 4:40 AM
This article shows how scientists can increase the scale of quantum machines while keeping them behaving quantum mechanically, by reading out the qubits with lasers instead of conventional wiring.
VendorFit's curator insight,
December 31, 2013 3:27 PM
Artificial intelligence is the holy grail of technological achievement: creating an entity that can learn from its own mistakes and can, independently of programmer intervention, develop new routines and programs. The New York Times reports that the first "learning" computer chip is to be released in 2014, an innovation with profound consequences for the tech market. When these devices become cheaper, they should allow for robotics and device manufacturing that incorporates more detailed sensory input and can parse real objects, like faces, from background noise.