Brain-inspired chip fits 1m 'neurons' on postage stamp
Scientists have produced a new computer chip that mimics the organisation of the brain, and squeezed in one million computational units called "neurons".
They describe it as a supercomputer the size of a postage stamp.
Each neuron on the chip connects to 256 others, and together they can pick out the key features in a visual scene in real time, using very little power.
The design is the result of a long-running collaboration, led by IBM, and is published in the journal Science.
"The cumulative total is over 200 person-years of work," said Dr Dharmendra Modha, the publication's senior author.
He told BBC News the processor was "a new machine for a new era". But it will take some time for the chip, dubbed TrueNorth, to be commercially useful.

Next generation
This is partly because programs need to be written from scratch to run on this type of chip, rather than for the traditional architecture, conceived in the 1940s, which still powers nearly all modern computers.
That design, where the processors and memory are separate, is a natural match for sequential, mathematical operations.
However, the heavily interconnected structure of biologically-inspired, "neuromorphic" systems like TrueNorth is said to be a much more efficient way of handling a lot of data at the same time.
"Our chip integrates computation, communication and memory very closely," Dr Modha said.
Instead of binary ones and zeros, the units of computation here are spikes. When its inputs are active enough, one of TrueNorth's "neurons" generates a spike and sends it across the chip to other neurons, taking them closer to their own threshold.
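The spike-and-threshold behaviour described above can be sketched in a few lines of code. This is an illustrative "integrate-and-fire" toy only; TrueNorth's actual neuron model, weights and routing are far more elaborate, and the class and parameter names here are invented for the example.

```python
# Toy spiking neuron: accumulates input, fires when it crosses a threshold,
# and passes its spike on to downstream neurons (illustrative only).

class SpikingNeuron:
    def __init__(self, threshold=2):
        self.threshold = threshold   # fire once accumulated input reaches this
        self.potential = 0           # running sum of incoming spikes
        self.targets = []            # downstream neurons this one connects to
        self.spike_count = 0         # how many times this neuron has fired

    def receive(self, weight=1):
        """An incoming spike nudges this neuron closer to its threshold."""
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0       # reset after firing
            self.spike_count += 1
            for target in self.targets:
                target.receive()     # propagate the spike onward

# Wire two neurons in sequence and feed four input spikes to the first.
a, b = SpikingNeuron(), SpikingNeuron()
a.targets.append(b)
for _ in range(4):
    a.receive()
print(a.spike_count, b.spike_count)  # a fires twice; those two spikes make b fire once
```

The point of the sketch is that computation here is event-driven: nothing happens until enough spikes arrive, which is why conventional sequential programs do not map onto such hardware.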
Software has to be written completely differently for these spiking-network systems.
"It will be interesting to see those programs develop - but don't hold your breath," commented Sophie Wilson, an eminent computer engineer based in Cambridge.
Ms Wilson, senior technical director at Broadcom and a fellow of both the Royal Academy of Engineering and the Royal Society, can definitely see a role for this next generation of computing strategies.
"It's clear that conventional scalar processing is getting very tricky for some of these tasks," she told the BBC. "Google Images, for example, does a marvellous job of recognising pictures of cats - but it is using large arrays of computers to do that."

Grid after grid
The building blocks for the TrueNorth chip are "neurosynaptic cores" of 256 neurons each, which IBM launched in 2011.
Dr Modha and his team managed to engineer an interconnected 64-by-64 grid of these cores on to a single chip, delivering over one million neurons in total.
Because each neuron is connected to 256 others, there are more than 256 million connections or "synapses".
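Those figures follow directly from the layout the article describes, and a quick back-of-the-envelope check confirms them (the grid and per-core counts below are taken from the text; everything else is just arithmetic):

```python
# Sanity-check the article's figures: a 64-by-64 grid of cores,
# 256 neurons per core, and 256 connections per neuron.
cores = 64 * 64            # 4,096 neurosynaptic cores
neurons = cores * 256      # "over one million neurons"
synapses = neurons * 256   # "more than 256 million connections"
print(f"{neurons:,} neurons, {synapses:,} synapses")
```

The exact totals are 1,048,576 neurons and 268,435,456 synapses, matching the "over one million" and "more than 256 million" figures quoted.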
This complexity is impressive for a man-made device just 3cm across, but still pales in comparison with the organ it emulates. Biological neurons, packed inside the brain, each make something in the order of 10,000 connections.
The chip, Dr Modha is quick to point out, is "endlessly scalable". Multiple units can be plugged together to form another, still more powerful assembly.
"This isn't a 10-15% improvement," he said. "You're talking about orders and orders of magnitude."
To demonstrate TrueNorth's capabilities, Dr Modha's team programmed it to do a visual perception party trick.
In a video filmed from a tower at Stanford University, a single chip analysed the moving images in real time and successfully identified which patches of pixels represented pedestrians, cyclists, cars, buses and trucks.
This is just the sort of task that the brain excels at, while traditional computers struggle.
Dr Modha envisages myriad next-generation applications, from glasses that help visually impaired people navigate, to robots for scouring the scene of a disaster.
But some of the gains might be overstated - or perhaps too eagerly anticipated.
Prof Steve Furber is a computer engineer at the University of Manchester who works on a similarly ambitious brain simulation project called SpiNNaker. That initiative uses a more flexible strategy, where the connections between neurons are not hard-wired.
He told BBC News that "time will tell" which strategy succeeds in different applications.
The new IBM chip was most significant, Prof Furber said, because of its sheer degree of interconnectedness. "I see it as continuing their programme of research - but it's an interesting and aggressive piece of integration," he said.
"This is another step in a programme, whose end point I suspect even they don't know at the moment."
Ms Wilson also pointed out that TrueNorth's efficiency, while it might trump a vast supercomputer, is not very far ahead of the latest small devices like smartphones and cameras, which are already engineered to minimise battery usage.
"Cellphone cameras can recognise faces," she said.
There is also a rival chip made by a company called Movidius, which Ms Wilson explained is not as adaptable (it is designed very specifically to process images) but uses even less power than TrueNorth.
That product, which we might see in devices as soon as next year, has also lifted elements of its computing strategy from the human brain.