Brains scale better than CPUs. So Intel is building brains


Neuromorphic hardware is proving that it can handle tasks organic brains excel at far more efficiently than conventional processors or GPUs can. Visual object recognition is perhaps the most widely demonstrated of these tasks, but other examples include playing foosball, adding kinesthetic intelligence to prosthetic limbs, and even interpreting skin touch in ways similar to how a human or animal might.

A close-up of an Intel Nahuku board; each board contains 8 to 32 Intel Loihi neuromorphic chips. Tim Herman/Intel Corporation

Loihi, the chip from which Pohoiki Beach is built, contains 130,000 neuron analogs; in hardware terms, that is roughly half the neural capacity of a fruit fly. Pohoiki Beach scales that up to 8 million neurons, about the neural capacity of a zebrafish. But what's perhaps more interesting than the raw computational power of the new neural network is how well it scales.
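Loihi's "neuron analogs" are spiking neurons realized in silicon. As a rough illustration of the model family such chips implement, here is a minimal leaky integrate-and-fire neuron sketch in Python; the function name and all constants are arbitrary illustrative values of ours, not Loihi parameters:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a software sketch of the
# kind of spiking-neuron model neuromorphic chips realize in hardware.
# Threshold and leak values here are made up for illustration.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time; emit a spike (1) when the membrane
    potential crosses the threshold, then reset the potential to zero."""
    potential = 0.0
    spikes = []
    for i in input_current:
        potential = potential * leak + i  # leaky integration of input
        if potential >= threshold:
            spikes.append(1)              # fire
            potential = 0.0               # reset after spiking
        else:
            spikes.append(0)              # stay silent
    return spikes

print(simulate_lif([0.5, 0.5, 0.5, 0.0, 0.9, 0.9]))  # [0, 0, 1, 0, 0, 1]
```

The key property this sketches is event-driven computation: a hardware neuron draws meaningful power only when it spikes, which is one reason spiking chips can be so much more power-efficient than always-on GPU arithmetic.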

"With the Loihi chip we've been able to demonstrate 109 times lower power consumption running a real-time deep learning benchmark compared to a GPU, and 5 times lower power consumption compared to specialized IoT inference hardware. Even better, as we scale the network up by 50 times, Loihi maintains real-time performance results and uses only 30 percent more power, whereas the IoT hardware uses 500 percent more power and is no longer real-time."

Chris Eliasmith, co-CEO of Applied Brain Research and professor at University of Waterloo
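The scaling claim in the quote is worth putting into back-of-the-envelope numbers. The helper below is ours and the baseline is an arbitrary unit; only the quoted percentages are from Eliasmith:

```python
# Illustrative arithmetic on the quoted scaling figures: a 50x larger
# network costs Loihi 30% more power but the IoT hardware 500% more.

def scaled_power(base_watts, percent_increase):
    """Power draw after scaling, given a percentage increase over baseline."""
    return base_watts * (1 + percent_increase / 100)

loihi = scaled_power(1.0, 30)   # 1.3x the baseline power at 50x scale
iot = scaled_power(1.0, 500)    # 6.0x the baseline power at 50x scale
print(f"At 50x scale, the IoT hardware draws {iot / loihi:.1f}x "
      f"more power than Loihi")
```

In other words, sublinear power growth under scaling, not just the raw 109x benchmark number, is what makes the architecture interesting: the efficiency gap widens as the network grows.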

Pohoiki Beach appears to be step two of Intel's neuromorphic roadmap. Step three, a still larger integration of Loihi chips to be called Pohoiki Springs, is scheduled to debut later this year. Neuromorphic design is still in the research phase, but this and similar projects from competitors such as IBM and Samsung should pave the way for eventual commoditization and commercial use.