Richie Bartlett

One step closer to SKYNET>>>

The potential: A computer that can not only see and hear, but understand what’s going on around it.

Can a silicon chip act like a human brain? Researchers at IBM say they’ve built one that mimics the brain better than any that has come before it.

In a paper published today in the journal Science, IBM said it used conventional silicon manufacturing techniques to create what it calls a neurosynaptic processor: a chip that can handle highly complex computations, rivaling a traditional supercomputer, while consuming no more power than a typical hearing-aid battery supplies.

The chip is also one of the biggest ever built, packing some 5.4 billion transistors, about a billion more than an Intel Xeon chip.

To do this, researchers designed the chip with a mesh network of 4,096 neurosynaptic cores. Each core contains elements that handle computing, memory and communicating with other parts of the chip. Each core operates in parallel with the others.
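To get a feel for how such a mesh differs from an ordinary processor, here is a minimal Python sketch, not IBM's actual design: a small grid of cores, each bundling its own neurons, synaptic memory and a spike router, passing events to a neighbor instead of fetching everything from one central memory. The grid size, random weights, threshold and leaky integrate-and-fire update are invented purely for illustration.

```python
import numpy as np

CORES = 64               # toy grid; TrueNorth uses 4,096 cores
NEURONS_PER_CORE = 256   # 4,096 cores x 256 neurons gives the ~1 million neurons cited below

class NeurosynapticCore:
    """One core: local synapse memory, local neuron state, and a spike router.
    Compute and memory sit together, unlike a Von Neumann CPU."""
    def __init__(self, rng):
        # Sparse crossbar of synaptic weights, stored right next to the neurons it feeds.
        self.weights = rng.choice([0.0, 1.0],
                                  size=(NEURONS_PER_CORE, NEURONS_PER_CORE),
                                  p=[0.9, 0.1])
        self.potential = np.zeros(NEURONS_PER_CORE)  # membrane potentials
        self.threshold = 2.0
        self.inbox = []                               # spike events from other cores

    def step(self):
        """Integrate incoming spikes, fire any neuron that crosses threshold."""
        for spike in self.inbox:
            self.potential += self.weights[spike]     # add the spiking axon's weight row
        self.inbox.clear()
        fired = np.where(self.potential >= self.threshold)[0]
        self.potential[fired] = 0.0                   # reset neurons that fired
        self.potential *= 0.9                         # leak a little each tick
        return fired                                  # events to route to other cores

rng = np.random.default_rng(0)
cores = [NeurosynapticCore(rng) for _ in range(CORES)]
cores[0].inbox.extend([3, 7, 42])                     # inject a few input spikes
for tick in range(5):
    for i, core in enumerate(cores):
        for neuron in core.step():
            # Toy routing: each fired neuron targets the next core in the grid.
            cores[(i + 1) % CORES].inbox.append(int(neuron))
```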

Multiple chips can be connected seamlessly, IBM says, to create a neurosynaptic supercomputer. The company even went so far as to build one using 16 of the chips.

The new design could shake up the conventional approach to computing, which has been more or less unchanged since the 1940s and is known as the Von Neumann architecture. In plain English, a Von Neumann computer (you're using one right now) keeps a program's instructions and its data in one shared memory and shuttles them, one at a time, to a separate processor.
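For contrast, a toy Von Neumann machine looks roughly like this: one memory holds both the program and the data, and a single processor loop fetches and executes instructions from it sequentially. The tiny instruction set is made up for illustration.

```python
# Toy Von Neumann machine: instructions and data share one memory, and a lone
# CPU loop fetches them one at a time -- the pattern the neurosynaptic design
# sidesteps by putting compute next to memory.
memory = {
    0: ("LOAD", "a", 10),            # the program lives in memory...
    1: ("LOAD", "b", 32),
    2: ("ADD",  "c", ("a", "b")),
    3: ("PRINT", "c", None),
    4: ("HALT", None, None),
    "a": 0, "b": 0, "c": 0,          # ...right alongside the data it operates on
}

pc = 0  # program counter
while True:
    op, dest, arg = memory[pc]       # fetch the next instruction from memory
    if op == "LOAD":
        memory[dest] = arg           # execute: write a constant back to memory
    elif op == "ADD":
        memory[dest] = memory[arg[0]] + memory[arg[1]]
    elif op == "PRINT":
        print(dest, "=", memory[dest])   # prints: c = 42
    elif op == "HALT":
        break
    pc += 1                          # step to the next instruction
```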

This chip, which has been dubbed TrueNorth, relies on its network of neurons to detect and recognize patterns in much the same way the human brain does. If you’ve read your Ray Kurzweil, this is one way to understand how the brain works — recognizing patterns. Put simply, once your brain knows the patterns associated with different parts of letters, it can string them together in order to recognize words and sentences. If Kurzweil is correct, you’re doing this right now, using some 300 million pattern-recognizing circuits in your brain’s neocortex.
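Here is a hedged sketch of that idea, with the stroke features and word list invented purely for illustration: low-level recognizers fire when they see the parts of a letter, and a higher-level recognizer fires when the recognized letters line up into a word it knows.

```python
# Toy hierarchical pattern recognition in the Kurzweil sense: letter
# recognizers fire on stroke patterns, word recognizers fire on letter
# sequences. All features and patterns here are made up for illustration.

LETTER_PATTERNS = {                  # each letter recognizer expects a set of strokes
    "c": {"open_curve"},
    "a": {"closed_curve", "tail"},
    "t": {"vertical", "crossbar"},
}
WORD_PATTERNS = {"cat", "act"}       # higher-level recognizers expect letter sequences

def recognize_letter(strokes):
    """Fire the letter recognizer whose expected strokes are all present."""
    for letter, expected in LETTER_PATTERNS.items():
        if expected <= strokes:
            return letter
    return None

def recognize_word(stroke_groups):
    """String recognized letters together and fire a word recognizer."""
    letters = "".join(recognize_letter(s) or "?" for s in stroke_groups)
    return letters if letters in WORD_PATTERNS else None

# One "c", one "a", one "t", each presented as raw stroke features:
groups = [{"open_curve"}, {"closed_curve", "tail"}, {"vertical", "crossbar"}]
print(recognize_word(groups))        # -> "cat"
```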

The chip would seem to represent a breakthrough on one of the long-standing problems in computing: computers are really good at doing math and reading words, but discerning meaning and context, or recognizing and classifying objects (things that come easily to humans), has been difficult for traditional machines. One way IBM tested the chip was to see if it could detect people, cars, trucks and buses in video footage and correctly recognize them. It worked.

In terms of complexity, the TrueNorth chip has a million neurons, about the same number as in the brain of a common honeybee. A typical human brain averages 100 billion neurons. But given time, the technology could be used to build computers that can not only see and hear, but understand what is going on around them.

Currently, the chip is capable of 46 billion synaptic operations per second (SOPS) per watt. That's a tricky apples-to-oranges comparison with a traditional supercomputer, where performance is measured in floating point operations per second, or FLOPS. But the most energy-efficient supercomputer now running tops out at about 4.5 billion FLOPS per watt.
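Taking those two figures at face value, and keeping in mind that they count different kinds of work, the back-of-the-envelope ratio looks like this:

```python
# Rough comparison of the efficiency figures cited above. The units measure
# different things (synaptic events vs. floating-point math), so the ratio is
# only a sense of scale, not a benchmark.
truenorth_sops_per_watt = 46e9    # 46 billion synaptic operations / second / watt
greenest_flops_per_watt = 4.5e9   # ~4.5 billion FLOPS per watt for the most
                                  # energy-efficient supercomputer at the time

print(truenorth_sops_per_watt / greenest_flops_per_watt)  # ~10x, apples to oranges
```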

Down the road, the researchers say in their paper, they foresee TrueNorth-like chips being combined with traditional systems, each solving the problems it is best suited to handle. But it also means that systems rivaling some of the capabilities of current supercomputers could fit into a machine the size of your smartphone while consuming even less energy.

The project was funded by DARPA, the Department of Defense's research arm. IBM collaborated with researchers at Cornell Tech and iniLabs.