(Image courtesy of Thinkstock).

Wading into the Battle over Embedded Brains

Oct. 31, 2017
LeapMind is a company trying to compress deep neural networks so that they fit inside custom chips embedded in everything from cameras to headphones.

LeapMind is trying to compress deep neural networks so that they can fit inside custom chips embedded in everything from cameras that interpret what they see to headphones that translate what they hear into other languages.

And the company, based in Tokyo, recently raised $10 million from investors including Intel Capital to give it a fighting chance. With the funding, the company plans to further improve its technology platform, called Juiz, which it says makes deep learning algorithms five hundred times smaller without sacrificing accuracy.

The compressed algorithms can then run inside embedded devices without hemorrhaging battery life and without communicating with the cloud. Deep learning algorithms are typically trained in the cloud on vast amounts of data, such as internet search terms or human speech, and after training they typically run in data centers in a process called inferencing.
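LeapMind has not published how Juiz reaches that figure; a factor of five hundred would presumably combine aggressive quantization with pruning and other techniques. As a rough, purely illustrative sketch, though, post-training weight quantization shows the basic arithmetic of shrinking a trained layer for an embedded target (all names and sizes below are hypothetical):

```python
import numpy as np

# Illustrative sketch only; this is not LeapMind's Juiz.
# Post-training quantization maps float32 weights onto a handful of
# integer levels so a trained layer fits in far less memory.

def quantize_layer(weights, bits=2):
    """Map float32 weights onto 2**bits evenly spaced levels."""
    levels = (1 << bits) - 1
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / levels
    codes = np.round((weights - w_min) / scale).astype(np.uint8)
    return codes, scale, w_min   # ship the codes plus two floats per layer

def dequantize(codes, scale, w_min):
    """Approximate weights reconstructed on the device at inference time."""
    return codes.astype(np.float32) * scale + w_min

# Hypothetical layer with one million float32 weights (about 4 MB).
weights = np.random.randn(1_000_000).astype(np.float32)
codes, scale, w_min = quantize_layer(weights, bits=2)

original_mb = weights.nbytes / 1e6    # ~4.00 MB
packed_mb = codes.size * 2 / 8 / 1e6  # bit-packed 2-bit codes: ~0.25 MB
# (codes are held in uint8 above for clarity; a real deployment would bit-pack them)
print(f"{original_mb:.2f} MB -> {packed_mb:.2f} MB "
      f"(~{original_mb / packed_mb:.0f}x smaller)")
print("mean reconstruction error:",
      float(np.abs(dequantize(codes, scale, w_min) - weights).mean()))
```

Quantization alone buys roughly an order of magnitude; the point of the sketch is simply that cutting weight precision, rather than redesigning the network, is what lets a trained model fit in an embedded device's memory.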

LeapMind, which was founded in 2012, is trying to move inferencing into edge devices. That would improve privacy and avoid the latency of round trips to cloud data centers, which are usually owned by Google, Amazon, and other corporations. LeapMind maps these algorithms onto FPGAs, which chew through less power than embedded CPUs or GPUs.
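LeapMind has not detailed its FPGA mapping, but one reason heavily quantized networks suit FPGAs is that, in the extreme binary case, the multiply-accumulates at the heart of a neural network collapse into bitwise logic. A minimal sketch, assuming weights and activations constrained to +1/-1 (illustrative only, not LeapMind's code):

```python
import numpy as np

# Minimal sketch: with +1/-1 weights and activations, a dot product
# becomes an XNOR followed by a population count, operations that map
# onto cheap FPGA lookup tables instead of DSP multipliers.

def binary_dot(a_bits: int, w_bits: int, n: int) -> int:
    """Dot product of two {-1, +1} vectors of length n, each packed into
    an integer (bit = 1 encodes +1, bit = 0 encodes -1)."""
    same = ~(a_bits ^ w_bits) & ((1 << n) - 1)   # 1 wherever the signs agree
    matches = bin(same).count("1")               # popcount
    return 2 * matches - n                       # agreements minus disagreements

# Sanity check against the ordinary floating-point dot product.
rng = np.random.default_rng(0)
n = 64
a = rng.choice([-1, 1], size=n)
w = rng.choice([-1, 1], size=n)
pack = lambda v: int("".join("1" if x > 0 else "0" for x in v), 2)
assert binary_dot(pack(a), pack(w), n) == int(a @ w)
print("binary dot product:", binary_dot(pack(a), pack(w), n))
```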

LeapMind said that the silicon can be used to prototype a custom chip – called an application-specific integrated circuit or ASIC – that runs even more efficiently. In the future, the company wants to start selling chips based on its own hardware architecture. For now, it plans to expand its sales team and hire engineering staff.

The funding round was led by Intel, which is also betting on strange silicon to sidestep the slowing pace of improvement in traditional chips. Intel’s Movidius unit sells Myriad, a line of chips used in cameras that can, for instance, detect cancerous moles on a person’s skin. It is also planning to sell a server chip to compete with Nvidia’s accelerator chips for deep learning.

Technology companies like Apple, Google, Huawei, and Microsoft are also pouring billions of dollars into custom chips that handle deep learning tasks in smartphones and augmented reality glasses. And start-ups are taking advantage of the slowing pace of Moore’s Law to make a convincing case that deep learning chips can be used in embedded devices.

They include Reduced Energy Microsystems, which has gutted the clocks in its processors to make them run at lower voltages, and which was the first chipmaker accepted into the start-up incubator Y Combinator. With $7.7 million in venture capital funding, Mythic is aiming to sell chips that merge computer logic and memory to cut power consumption.

Horizon Robotics recently raised $100 million from Intel Capital and others to architect chips for everything from autonomous cars to smart speakers. The firm was founded by Baidu’s former head of deep learning research Kai Yu, and it has partnered with Bosch and others while putting the finishing touches on its brain processing unit (BPU).

New chips from ThinCI – funded by Intel’s former chief product officer Dadi Perlmutter and venture capitalist Dado Banatao – can be installed in a network of security cameras to track packages or identify criminals. Tenstorrent, founded by former Nvidia and AMD engineers, is creating a chip architecture for servers that can also be scaled down for sensors.

