Neuromorphic computing

Scooped by Alin Velea

The biggest upgrade in Intel's smallest chips is almost here

The chip family Intel believes will give it the edge in future ultraportables, tablets, and 2-in-1 hybrids, not to mention the Internet of Things, will show up in systems by the holidays, as 14nm production kicks off. Intel Core M, codenamed Broadwell, will be the first chip to use the new production process, which promises better battery life and lower heat.

In fact, versus the previous generation of chips, the 14nm silicon hits a thermal design point less than half of what came before, with performance roughly the same. Although the first applications are likely to be mobile devices, Intel also has plans for 14nm chips in Xeon servers as well as more mainstream devices with Core i3, Core i5, and Core i7 processors.

Brain-Inspired Computing Reaches a New Milestone

For the past few years, tech companies and academic researchers have been trying to build so-called neuromorphic computer architectures—chips that mimic the human brain’s ability to be both analytical and intuitive in order to deliver context and meaning to large amounts of data. Now the leading effort to develop such a system has achieved a new milestone, producing a 5.4-billion-transistor chip with more than 4,000 neurosynaptic cores.

Each core consists of computing components analogous to their biological counterparts—core memory functions similar to the brain’s synapses, processors that provide the core’s nerve cells (or neurons), and communication capabilities handled by wiring akin to the brain’s axon nerve fibers. The IBM and Cornell University researchers heading this project published their results in the August 8 edition of Science.

IBM cracks open a new era of computing with brain-like chip: 4096 cores, 1 million neurons, 5.4 billion transistors

Scientists at IBM Research have created by far the most advanced neuromorphic (brain-like) computer chip to date. The chip, called TrueNorth, consists of 1 million programmable neurons and 256 million programmable synapses across 4096 individual neurosynaptic cores. Built on Samsung’s 28nm process and with a monstrous transistor count of 5.4 billion, this is one of the largest and most advanced computer chips ever made. Perhaps most importantly, though, TrueNorth is incredibly efficient: The chip consumes just 72 milliwatts at max load, which equates to around 400 billion synaptic operations per second per watt — or about 176,000 times more efficient than a modern CPU running the same brain-like workload, or 769 times more efficient than other state-of-the-art neuromorphic approaches. Yes, IBM is now a big step closer to building a brain on a chip.
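As a quick sanity check, the efficiency figures quoted above can be multiplied out. This back-of-the-envelope sketch uses only the numbers in the snippet; the derived values are their products, not independent measurements.

```python
# Quick arithmetic on the figures quoted above (all inputs from the article;
# the derived numbers are just their product, not measurements).
power_w = 72e-3               # TrueNorth power at max load: 72 mW
sops_per_watt = 400e9         # ~400 billion synaptic ops per second per watt
cpu_efficiency_gap = 176_000  # quoted efficiency ratio vs. a modern CPU

throughput = power_w * sops_per_watt          # total synaptic ops per second
cpu_power_w = power_w * cpu_efficiency_gap    # implied CPU power, same workload

print(f"TrueNorth throughput: {throughput:.1e} ops/s")                  # 2.9e+10
print(f"Implied CPU draw for the same job: {cpu_power_w / 1000:.1f} kW")  # 12.7 kW
```

In other words, per its own numbers, a conventional CPU would need on the order of twelve kilowatts to match a chip that sips 72 milliwatts.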

Atomic Switch Networks for Cognitive Technology

While modern computers have revolutionized information processing, the mammalian brain continues to reign supreme in tasks such as recognizing sounds or objects, reading handwriting, or predicting where food may be found based on both memory and environmental clues. 
This contrast in performance stems from the radically divergent physical structures and operating mechanisms of neuronal networks and digital circuits. Computers employ a microprocessor to rapidly perform simple, error-free calculations in a sequential fashion and store data in physically separate memory banks. In contrast, the brain comprises a vast network of neurons serving simultaneously as both information processors and memory units, resulting in comparatively slow and imprecise operations in a parallel or distributed manner.

Data is Growing Faster Than Computing Power

There is an inconvenient truth in technology: The amount of data keeps growing exponentially, while the increases in the power of computers are slowing down.

Somebody needs to invent a new way for computers to work—different from the programmable electronic machines that have dominated since the 1950s. Otherwise, we’ll either get overrun by data we can’t use or we’ll end up building data centers the size of Rhode Island that suck up so much electricity they’ll need their own nuclear power plants.

And, seriously, does the world want Google to have nuclear capabilities?

Rise of the cyborgs

Of all the powers that we have imagined for the cyborg, which do we most covet? Their ability to see and sense detail in the environment? The ability to manipulate things with the dexterity and power of a machine? Or perhaps it would be to command vast amounts of information which can be processed at tremendous speed?

If you chose none of those, you chose as any cyborg likely would have. The cyborg’s greatest power, that from which it derives the most satisfaction (to use that term loosely), must be the ability to see itself. As humans, we are a mystery unto ourselves. If we were suddenly presented with one of our own organs from beneath our skin, before the panic set in, we would be taken by the awe and mystery that a mother must feel after the delivery of her child. To know the mass inside our skull will be to know ourselves — and to control what we might become.

Will it ever be possible to compute the human brain?

In the video below, Tampa Bay Rays third baseman Evan Longoria is seen to make a spectacular grab to save a reporter from certain death — or at least serious injury. Granted, Evan may have had a little help from video editing, but at the professional level at least, comparable performances no doubt occur every time an umpire gives the command to play ball. The computations a man-made machine would need to perform to detect and track an incoming threat, like an errant ball, and simultaneously perform motor adjustments to intercept it are certainly not trivial. Yet, for a human brain, the computations underlying such virtuosity pale in comparison to the massive background processing interleaved to create the awareness to perform the task in the first place — or to choose a different course of action on, say, the tenth run of the scenario.

Are synchronized ‘wacky oxide’ chips the key to brain-like computers?

Researchers at Penn State have created a new kind of computer chip that could be the key to creating neuromorphic (brain-like) computers that can solve incredibly complex problems while consuming just 1% of the power of current chips. These new chips contain a special “wacky oxide” (that’s the scientific term) material that oscillates at a certain frequency and synchronizes with other nearby chips — much in the same way that nearby neurons often fire in synchrony.

As you probably know, all modern computers are based on Boolean logic — a strict set of logical rules (AND, OR, NOT) that always result in a true or false answer. In a modern processor, every functional unit is a collection of Boolean logic gates made out of transistors. A transistor (and thus the logic gate) can either be on or off — true or false — and no state in between. So far, as you can see from all the computers and digital interfaces that surround us, binary and Boolean computing has been rather successful.
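Since the snippet above leans on Boolean gate composition, here is the standard textbook illustration (not from the article): AND, OR, and NOT can all be built from a single NAND primitive, which is why a processor's functional units reduce to one repeated transistor pattern.

```python
# NAND is "functionally complete": every Boolean gate can be composed from it.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)             # NAND with itself inverts the input

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))       # invert a NAND to recover AND

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b)) # De Morgan: NOT a NAND NOT b == a OR b

# Truth-table check: the composed gates match Python's own operators.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
print("NAND-composed gates verified")
```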

The human brain’s remarkably low power consumption, and how computers might mimic its efficiency

A new paper from researchers working in the UK and Germany dives into how much power the human brain consumes when performing various tasks — and sheds light on how humans might one day build similar computer-based artificial intelligences. Mapping biological systems isn’t as sexy as the giant discoveries that propel new products or capabilities, but that’s because it’s the final discovery — not the decades of painstaking work that lays the groundwork — that tends to receive all the media attention.

This paper — Power Consumption During Neuronal Computation — will run in an upcoming issue of IEEE’s magazine, “Engineering Intelligent Electronic Systems Based on Computational Neuroscience.” Here at ET, we’ve discussed the brain’s computational efficiency on more than one occasion. Put succinctly, the brain is more power efficient than our best supercomputers by orders of magnitude — and understanding its structure and function is absolutely vital.

HP Labs’ memristor can turn RAM into SSDs

HP has what it calls "The Machine", practically a researcher's plaything for experimenting on emerging computer technologies. One such technology that is already quite close to becoming a reality is HP's "memristor", a portmanteau of "memory" and "resistor" that could forever blur the boundaries between non-volatile disk storage and volatile RAM.

The concept of a memristor is hardly new, but it is largely unknown outside this particular field of computer science and engineering. Conceived in 1971 by Professor Leon Chua of UC Berkeley, a memristor is like a resistor except that its resistance depends on the magnitude and direction of the current that has passed through it. In short, a memristor's resistance is effectively "writable", making it more akin to computer memory, RAM or otherwise. But just as important is the fact that a memristor actually holds on to that state even without power, unlike RAM and more like disk storage. It wasn't until 2008 that the first, though somewhat contentious, real-world implementation was made by HP Labs senior fellow R. Stanley Williams.
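A toy numerical sketch of the behaviour described above, loosely following the linear ion-drift picture HP published in 2008: resistance depends on the charge that has flowed through the device, and the state persists when the power is removed. The constants, time step, and update rule here are illustrative placeholders, not real device parameters.

```python
# Toy memristor: state w in [0, 1] drifts with the charge passed through it.
# All constants are illustrative, not device-accurate.
R_ON, R_OFF = 100.0, 16_000.0   # fully-doped / undoped resistance (ohms)
K = 100_000.0                   # state-change rate per coulomb, illustrative
w = 0.1                         # internal state (fraction of doped region)
dt = 1e-4                       # time step (s)

def step(voltage: float) -> float:
    """Apply a voltage for one time step; return the current. State persists."""
    global w
    resistance = w * R_ON + (1.0 - w) * R_OFF
    current = voltage / resistance
    w = min(1.0, max(0.0, w + K * current * dt))  # drift with charge, clamped
    return current

for _ in range(2000):           # positive bias drives resistance down ("write")
    step(1.0)
low_r = w * R_ON + (1.0 - w) * R_OFF
for _ in range(2000):           # at zero bias, no current flows and w is frozen
    step(0.0)
assert abs((w * R_ON + (1.0 - w) * R_OFF) - low_r) < 1e-9
print(f"resistance after write: {low_r:.0f} ohms (retained with no power)")
```

The last loop is the point: unlike DRAM, nothing refreshes the cell, yet the written resistance survives, which is why memristors blur the storage/memory boundary.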

HP plans to launch memristor, silicon photonic computer within the decade

In 2008, scientists at HP invented a fourth fundamental component to join the resistor, capacitor, and inductor: the memristor. Theorized back in 1971, memristors showed promise in computing as they can be used to both build logic gates, the building blocks of processors, and also act as long-term storage.

At its HP Discover conference in Las Vegas today, HP announced an ambitious plan to use memristors to build a system, called simply "The Machine," shipping as soon as the end of the decade. By 2016, the company plans to have memristor-based DIMMs, which will combine the high storage densities of hard disks with the high performance of traditional DRAM.

John Sontag, vice president of HP Systems Research, said that The Machine would use "electrons for processing, photons for communication, and ions for storage." The electrons are found in conventional silicon processors, and the ions are found in the memristors. The photons come in because the company wants to use optical interconnects in the system, built using silicon photonics technology. With silicon photonics, photons are generated on, and travel through, "circuits" etched onto silicon chips, enabling conventional chip manufacturing to construct optical parts. This lets the parts of the system that use photons be tightly integrated with the parts that use electrons.

New Microchip Can Mimic How a Human Brain Thinks

Researchers from the University of Zurich have created neuromorphic chips that can mimic the way a human brain processes information in real time.

With the assistance of an artificial sensory processing system, these chips are able to display cognitive abilities.

Giacomo Indiveri, professor at the Institute of Neuroinformatics (INI), of the University of Zurich and ETH Zurich, explained that the goal of the team was to “emulate the properties of biological neurons and synapses directly on microchips.”

With the creation of artificial neuromorphic neurons that can perform specified tasks, the researchers are able to advance toward a complex sensorimotor system that can complete tasks in real time.

Strikingly, behavior can be replicated by formulating it as a finite-state machine that can then be transferred into the neuromorphic hardware.

Indiveri stated: “The network connectivity patterns closely resemble structures that are also found in mammalian brains.”
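The finite-state-machine step can be illustrated in ordinary code. The states and inputs below are invented for illustration; the actual compilation of such a machine onto neuromorphic hardware is the researchers' contribution and is not shown here.

```python
# A behavioural spec as a finite-state machine: (state, input) -> next state.
# States and input symbols are hypothetical, for illustration only.
transitions = {
    ("idle",   "stimulus"): "attend",
    ("attend", "target"):   "act",
    ("attend", "timeout"):  "idle",
    ("act",    "done"):     "idle",
}

def run(start: str, inputs: list[str]) -> str:
    """Feed a sequence of input symbols through the machine; return the end state."""
    state = start
    for symbol in inputs:
        state = transitions.get((state, symbol), state)  # stay put on unknown input
    return state

print(run("idle", ["stimulus", "target", "done"]))  # -> idle
```

A table like `transitions` is exactly the kind of compact, discrete specification that can then be mapped onto populations of spiking neurons.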

Neuromorphic chips could help reverse-engineer the human brain

Researchers at the University of Zurich and ETH Zurich have designed a sophisticated computer system that is comparable in size, speed and energy consumption to the human brain. Based on the development of neuromorphic microchips that mimic the properties of biological neurons, the research is seen as an important step in understanding how the human brain processes information and opens the door to fast, extremely low-power electronic systems that can assimilate sensory input and perform user-defined tasks in real time.

IBM SyNAPSE Chip: 5 Fast Facts You Need to Know

SyNAPSE Stands for “Systems of Neuromorphic Adaptive Plastic Scalable Electronics.” According to a report on CNN Money, the chip functions like a human brain when paired with other chips of the same caliber. The chip is able to sense, taste, feel, smell, and hear its surroundings.

Each chip is equivalent to the intelligence of one bee, with a chip containing over 4,000 cores, 1 million “neurons” and 256 million “synapses.” When the chips start working together, they can become quite powerful.
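Per IBM's published TrueNorth description, those headline numbers decompose exactly per core: 256 neurons each, wired through a 256x256 synaptic crossbar. A quick consistency check (note the article rounds the totals):

```python
# How the headline figures decompose per core (the article rounds them).
cores = 4096
neurons_per_core = 256               # each core hosts 256 neurons...
synapses_per_core = 256 * 256        # ...fed through a 256x256 crossbar

assert cores * neurons_per_core == 1_048_576       # the "1 million neurons"
assert cores * synapses_per_core == 268_435_456    # the "256 million synapses"
print("per-core decomposition matches the headline figures")
```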

Brain-inspired Chip

Six years ago, IBM and our university partners embarked on a quest—to build a brain-inspired machine—that at the time appeared impossible. Today, in an article published in Science, we deliver on the DARPA SyNAPSE metric of a one million neuron brain-inspired processor. The chip consumes merely 70 milliwatts, and is capable of 46 billion synaptic operations per second, per watt: literally a synaptic supercomputer in your palm.

Along the way—progressing through Phase 0, Phase 1, Phase 2, and Phase 3—we have journeyed from neuroscience to supercomputing, to a new computer architecture, to a new programming language, to algorithms, applications, and now to a new chip—TrueNorth.

Let me take this opportunity to take you through the road untraveled. At this moment, I hope this reflection will incite within you a burning desire to collaborate and partner with us to make the future journey a joint one.

Francesco Pavone: Brain research aims to advance neuromorphic computing

Francesco Pavone is director and professor at the European Laboratory for Nonlinear Spectroscopy (LENS) in Florence, Italy. He directs a research group working in the field of biophotonics on single-molecule biophysics, microscopy imaging-spectroscopy techniques, biomedical imaging, and laser manipulation of bio-samples. In particular, he is developing new microscopy techniques for high-resolution and high-sensitivity imaging, and for laser manipulation. These techniques have been applied for single-molecule biophysics, single-cell imaging, and optical manipulation. Tissue imaging is another research area where nonlinear optical techniques have been applied for skin and neural-tissue imaging. Recently, in-vivo imaging apparatus has been developed and applied to animals and humans.

4-4-2 becomes 0101: inside the competitive world of robot football

The whistle has just been blown on one of the most thrilling events on the international sporting calendar. It took place in Brazil and pitted teams from all over the world against each other, each hoping to make it into the history books. But no managers were fired, no grass had to be watered and certainly no one got bitten.

The event was the Robocup, a tournament that sees professional footballers replaced by robots. It’s one of a number of regular tournaments for teams of programmers and robotics experts to show off their latest work.

The Robocup standard platform league matches play out on a much smaller scale than your average World Cup match. An arena of around 6 metres by 9 metres is marked out as a miniature pitch, and 10, rather than 22, players file on to battle it out. The players are NAO robots, state-of-the-art bipedal humanoid robots which stand about 60cm tall.

Japan Wants 2020 Robot Olympics Alongside Human Olympics

"In 2020 I would like to gather all of the world's robots and aim to hold an Olympics where they compete in technical skills," said Japanese Prime Minister Shinzo Abe last week.

It's about time!

There have been, and are, all kinds of competitive robotics events that take place all over the world. We were huge fans of RoboGames, FIRST, and RoboCup (which is taking place right now in Brazil). And there's Robo-One, RoboCup@Home, Sparkfun's Autonomous Vehicle Competition, along with any number of research-based competitions that take place at ICRA and IROS. All of these events are fantastic, but a flagship event like a worldwide robot olympics would be something special.

Competition spurs innovation. You don't have to look any farther than the DARPA Robotics Challenge (or the earlier DARPA Grand Challenge for autonomous vehicles) to see how much of an impact these kinds of events can have on the speed and focus of technological advancement. We certainly don't mean to suggest that smaller competitions (namely, those without an Olympic-scale backing) aren't relevant or important, because they absolutely are. But there are things that can only be accomplished when you have a lot of resources to throw around, as DARPA has demonstrated.

EU spends $1.4 billion on flawed, but exciting, brain-in-a-box project

Big physics has been showered in riches since the World War II era, yet today finds itself in a bit of a crisis as traditional funding priorities are increasingly questioned. While projects like the Large Hadron Collider (LHC) have enjoyed funding closer to $10 billion, projects in the life sciences which would offer tangible benefit to more people have struggled for survival. A man we will be hearing a bit more about in the future, Henry Markram, has just sold the European Union a brain in a box for €1 billion ($1.4 billion). When his ticket comes due in 10 years, there is one thing that can be counted on — someone is going to have some explaining to do.

In a major announcement this week, Europe has funded two new projects, promising over $3 billion in total. One study will be focused on new applications for graphene, while the other, now being described as the Human Brain Project (HBP), seeks nothing less than a simulation of the human brain.

IBM creates Corelet programming language to make software that operates like the human brain

At the International Joint Conference on Neural Networks held this week in Dallas, researchers from IBM have taken the wraps off a new software front-end for its neuromorphic processor chips. The ultimate goal of these most recent efforts is to recast Watson-style cognitive computing, and its recent successes, into a decidedly more efficient architecture inspired by the brain. As we shall see, the researchers have their work cut out for them — building something that on the surface looks like the brain is a lot different from building something that acts like the brain.

Head researcher of IBM’s Cognitive Computing group, Dharmendra Modha, announced last November that his group had simulated over 500 billion neurons using the Blue Gene/Sequoia supercomputer at the Lawrence Livermore National Laboratory (LLNL). His claims, however, continue to draw criticism from others who say that the representation of these neurons is too simplistic. In other words, the model neurons generate spikes like real neurons, but the underlying activity that creates those spikes is not modeled in sufficient detail, nor are the details of connections between them.

Sandia National Labs is stepping up its neuro-inspired computer chip research

Human ingenuity has given birth to incredibly powerful computers that can plow through more calculations in a second than most people could in their entire lives, but computers still aren’t terribly adaptable. The human brain is a very different kind of computer — a massively parallel processor that has been shaped by millions of years of evolution to recognize patterns and adjust to changing situations. This is the kind of capability computer science researchers are now trying to unlock, and scientists at Sandia National Laboratories are stepping up their game to design neuro-inspired, or neuromorphic, computer systems.

Sandia isn’t just attracted to the idea of computers designed like brains because of the capabilities, but the human brain is also incredibly efficient. A computer has trouble telling the difference between a picture of a dog and a cat, but it eats up hundreds of watts of power simply trying. A brain, by contrast, operates continuously for decades and only consumes roughly the same power as a 20-watt light bulb.

Neural networks that function like the human visual cortex may help realize faster, more reliable pattern recognition

Despite decades of research, scientists have yet to create an artificial neural network capable of rivaling the speed and accuracy of the human visual cortex. Now, Haizhou Li and Huajin Tang at the A*STAR Institute for Infocomm Research and co-workers in Singapore propose using a spiking neural network (SNN) to solve real-world pattern recognition problems. Artificial neural networks capable of such pattern recognition could have broad applications in biometrics, data mining and image analysis.

Humans are remarkably good at deciphering handwritten text and spotting familiar faces in a crowd. This ability stems from the visual cortex—a dedicated area at the rear of the brain that is used to recognize patterns, such as letters, numbers and facial features. This area contains a complex network of neurons that work in parallel to encode visual information, learn spatiotemporal patterns and classify objects based on prior knowledge or statistical information extracted from patterns.
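Spiking neural networks of the kind proposed here are built from units that integrate input over time and emit discrete spikes, rather than computing continuous activations. A minimal leaky integrate-and-fire neuron, with illustrative constants (this is the generic textbook model, not the specific network in the A*STAR work):

```python
# Leaky integrate-and-fire (LIF) neuron: membrane potential leaks toward rest,
# integrates input current, and fires a spike on crossing a threshold.
def lif_spikes(currents, tau=20.0, threshold=1.0, dt=1.0):
    """Return the time indices at which the neuron spikes."""
    v, spikes = 0.0, []
    for t, i in enumerate(currents):
        v += dt * (-v / tau + i)   # leak toward rest, integrate the input
        if v >= threshold:
            spikes.append(t)
            v = 0.0                # reset after the spike
    return spikes

# A steady supra-threshold input produces a regular spike train.
train = lif_spikes([0.1] * 100)
print(f"{len(train)} spikes, first at t={train[0]}")  # -> 7 spikes, first at t=13
```

Information is then carried in the timing of such spike trains, which is what lets these networks learn the spatiotemporal patterns the article describes.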

HP starts a memristor-based space program to launch ... THE MACHINE

HP may have found a way to save itself from oblivion, but apparently the only way to be sure is to throw three quarters of its research team at an ambitious new product suite based on a much-hyped yet troublesome technology.

The beleaguered IT giant plans to rejuvenate itself with a set of advanced technologies that, when combined, make a device called "The Machine" that can be as small as a smartphone and as large as a 160-rack supercomputer, the company announced at its HP Discover event in Las Vegas on Wednesday.

This family of products will make use of HP's massively delayed "memristor" memory substrate, along with silicon photonics, a custom operating system, and customized chips, HP said.

If the plan works, HP may defeat the rot in its printing and computer divisions that is currently weighing it down. If it doesn't work, though, HP could become another Sun Microsystems, a company that birthed some incredible inventions but never quite figured out how to make money from them.

Memristor - The Missing Circuit Element

Most of us have heard of resistors, varistors, thermistors and so on: there are plenty of '-istors' to bug our minds. Here's one more; it's called a memristor, and it's a really cool thing. We have all heard of resistors that vary with physical factors like temperature (all resistors vary with temperature), force, pressure, voltage and so on. The memristor somehow stands apart from all of these.

Neuromimetic processor board beats supercomputers at their own game

Neuromorphic computing designs have yet to compete with traditional computing architectures, which continue to impress. Researchers have now developed a new computing platform, known as Neurogrid, that runs around 100,000 times more efficiently. Each Neurogrid board, drawing just 5 watts, can simulate the detailed neuronal activity of one million neurons — and it can now do it in real time.