ITC keynoter touts brain-inspired computing

Oct. 7, 2015

Anaheim, CA. Brain-inspired computing was the topic of a keynote address by Karim Arabi, vice president of engineering at Qualcomm, at the International Test Conference Tuesday. The mission, he said, is to make everything around us smarter and connected.

At one time, Arabi said, people bought phones primarily based on their audio quality, but expectations today are much more diverse. We want our devices to be always on and always aware, he said, using sensors and other mechanisms to observe activities, discover patterns, and make decisions—they must anticipate, predict, and alert.

He turned his attention to computing in the cloud and at the edge. Big data and abundant computing power are tending to push computation into the cloud, he said, but with the deployment of the Internet of Things, we can’t afford to send all the data to the cloud. Consequently, demand for processing horsepower at the edge is increasing. And for efficiency, we need the right machine for the right task.

He then cited some key specs: the human brain can store 3.5 petabytes and operates at 20 petaFLOPS while requiring just 20 W of power. In contrast, the IBM Sequoia can store 1.6 petabytes, operates at 16.3 petaFLOPS, and consumes a massive 7.9 MW, nearly 400,000 times the power required by the brain. The brain, he said, is efficient because it is a massively parallel machine with 86 billion neurons: “it’s the most interconnected machine in the world.”
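
Those ratios are easy to check. Below is a quick back-of-the-envelope calculation in Python using the figures as cited in the talk; it confirms the roughly 400,000:1 power gap and expresses each machine’s efficiency in petaFLOPS per watt.

```python
# Figures as cited in the keynote (approximate).
brain_power_w = 20.0        # human brain: ~20 W
brain_pflops = 20.0         # ~20 petaFLOPS

sequoia_power_w = 7.9e6     # IBM Sequoia: 7.9 MW
sequoia_pflops = 16.3       # 16.3 petaFLOPS

# Power gap: ~395,000x, i.e. "nearly 400,000 times"
print(f"Power ratio: {sequoia_power_w / brain_power_w:,.0f}x")

# Energy efficiency in petaFLOPS per watt
print(f"Brain:   {brain_pflops / brain_power_w:.2f} PFLOPS/W")
print(f"Sequoia: {sequoia_pflops / sequoia_power_w:.2e} PFLOPS/W")
```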

He added that the brain has no system clock; it is event-driven. There is no hardware/software distinction, and the same components handle processing and memory. And in contrast to our designs, which start out simple and become more complex, the brain continuously simplifies itself. Further, while multicore scaling in computers tends to top out at around 10 cores, our brain operates with millions of “cores” in parallel.

The brain is in fact a heterogeneous machine, he said, consisting of structures such as the frontal lobe, cerebellum, motor cortex, sensory cortex, temporal lobe, and occipital lobe; further, the left brain deals with logic, math, and science, while the right brain handles feelings, art, and philosophy. Large sets of neurons, he said, are specialized to operate in each area, but within each area operations are massively parallel.

Consequently, he said, multicore and heterogeneous computing constitute the first step toward brain-inspired computing. The second step is deep learning. The third is approximate computing, which tolerates errors and relies on fault tolerance. The fourth and final step, he said, is neuromorphic computing, which “…excels at computing complex dynamics using a small set of computational primitives.”

Where does this lead? Our brains are not well equipped to grasp sustained exponential growth, he said, recounting the story of the inventor of chess, who asked to be compensated with grains of rice: one on the first square, with each succeeding square doubling the count of the preceding square. The first 32 squares alone would hold more than 4 billion grains, a big number.

But filling all 64 squares would take roughly 18 quintillion grains, which, as Arabi commented, would require more volume than Mt. Everest.

At this point, he said, we are moving onto the second half of the chessboard.
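
The arithmetic behind the chessboard story is a straightforward doubling series; a minimal Python sketch makes the first-half/second-half contrast explicit.

```python
# Rice on an 8x8 chessboard: square n holds 2**(n-1) grains.
def grains_on_square(n: int) -> int:
    return 2 ** (n - 1)

def total_through_square(n: int) -> int:
    # 1 + 2 + 4 + ... + 2**(n-1) = 2**n - 1
    return 2 ** n - 1

print(f"{grains_on_square(32):,}")      # ~2.1 billion on the 32nd square alone
print(f"{total_through_square(32):,}")  # ~4.3 billion across the first half of the board
print(f"{total_through_square(64):,}")  # ~18.4 quintillion across the full board
```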

Can brain-inspired computing ever rival the performance of the human brain? In an approach to an answer, Arabi discussed concepts such as Hebbian learning (“neurons that fire together wire together, neurons that fire out of sync lose their link”) and spike-timing-dependent plasticity (STDP).
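
To make the “fire together, wire together” rule concrete, here is a minimal sketch of a pair-based STDP weight update, not drawn from the talk and with illustrative parameter values: a synapse is strengthened when the presynaptic spike precedes the postsynaptic spike and weakened when the order is reversed.

```python
import math

# Pair-based STDP rule; learning rates and time constants are illustrative.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in milliseconds

def stdp_delta_w(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post: "fire together, wire together"
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    if dt < 0:    # post fires before pre: "out of sync, lose their link"
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

print(stdp_delta_w(10.0, 15.0))   # positive: synapse strengthened
print(stdp_delta_w(15.0, 10.0))   # negative: synapse weakened
```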

But, he concluded, “The real challenge for neuromorphic computing is not technology but training.” It takes people 30 years to earn a Ph.D., he said, adding that you can’t put a product on the market and wait 30 years for it to learn what it needs to know.

He also conceded that the human brain is not very good at brute-force computing; it struggles, for example, to multiply two large integers.

As a result, he said, “The best solution is a combination of traditional computing for brute-force calculations and brain-inspired computing for complex decisions and pattern generation.” Nevertheless, he concluded, “The computer will eventually outpace our intelligence and logic in all possible ways.”

About the Author

Rick Nelson | Contributing Editor

Rick is currently Contributing Technical Editor. He was Executive Editor for EE from 2011 to 2018. Previously he served on several publications, including EDN and Vision Systems Design, and has received awards for signed editorials from the American Society of Business Publication Editors. He began as a design engineer at General Electric and Litton Industries and earned a BSEE degree from Penn State.
