
LeapMind Wades into Battle over Embedded Brains

Tokyo-based start-up LeapMind is tackling one of the chip industry's trickier issues. It's trying to compress deep neural networks so that they fit into things like cameras that interpret what they see and headphones that translate a foreign language.

LeapMind recently raised $10 million from investors including Intel Capital to give it a fighting chance. With the funding, the company plans to further improve its technology platform, called Juiz, which it says can shrink deep learning algorithms to as little as one five-hundredth of their original size without sacrificing accuracy.
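The article doesn't spell out how Juiz achieves that kind of compression. One common way to shrink a trained network for embedded inference is to quantize its weights down to very few bits. The sketch below shows a generic 1-bit (binary) weight quantization with a per-tensor scale, written in NumPy purely as an illustration of the idea; it is not a description of LeapMind's actual method, and every function name and number in it is hypothetical.

```python
# Illustrative sketch only: generic 1-bit weight quantization (binarization
# with a per-tensor scale), one common way to compress a trained network
# for embedded inference. This is NOT LeapMind's proprietary Juiz scheme.
import numpy as np

def binarize_weights(w: np.ndarray):
    """Quantize float32 weights to {-1, +1} plus a single float scale.

    Storage drops from 32 bits per weight to roughly 1 bit per weight
    (about a 32x reduction) before any pruning or entropy coding on top.
    """
    scale = np.mean(np.abs(w))                      # per-tensor scaling factor
    signs = np.where(w >= 0, 1, -1).astype(np.int8)  # keep only the signs
    return signs, scale

def dequantize(signs: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct an approximation of the original weights."""
    return signs.astype(np.float32) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 256)).astype(np.float32)  # a made-up layer
    signs, scale = binarize_weights(w)
    w_hat = dequantize(signs, scale)
    orig_bytes = w.nbytes          # 32 bits per weight
    packed_bytes = signs.size // 8  # ~1 bit per weight once bit-packed
    print(f"compression: {orig_bytes / packed_bytes:.0f}x, "
          f"mean abs error: {np.abs(w - w_hat).mean():.3f}")
```

Quantization this aggressive normally requires retraining or fine-tuning the network with the low-precision weights in the loop to recover accuracy, which is presumably where a platform like Juiz earns its keep.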

The algorithms can then run inside embedded devices without hemorrhaging battery life and without communicating with the cloud, where the digital neurons are trained on things like internet search terms or human voices. After training, the algorithms typically run in data centers in a process called inferencing.

LeapMind, which was founded in 2012, is trying to move inferencing into edge devices. That would improve privacy and cut the latency of talking with the clouds of Google, Amazon, and other corporations. LeapMind maps these algorithms onto FPGAs, which chew through less power than embedded CPUs or GPUs.

LeapMind said that the silicon can be used to prototype a custom chip – called an application-specific integrated circuit or ASIC – that runs even more efficiently. In the future, the company wants to start selling chips based on its own hardware architecture. For now, it plans to expand its sales team and hire engineering staff.

The funding round was led by Intel Capital. Its parent, Intel, is betting on stranger silicon to sidestep the slowing pace of improvement in traditional chips and to better compete with Qualcomm and Nvidia. Intel's Movidius unit sells Myriad, a line of chips used in cameras that do things like detect cancerous moles on a person's skin.

LeapMind’s funding comes as technology companies like Apple, Google, Huawei, and Microsoft pour billions into custom silicon to handle deep learning in phones and augmented reality glasses. Taking advantage of the slowing of Moore’s Law, many start-ups are looking to sell these kinds of custom chips to a wider clientele.

They include Reduced Energy Microsystems, which gutted the clocks in its processors so that they could run at lower voltages. (The company was the first chipmaker accepted into the prominent start-up incubator Y Combinator.) Mythic, which has raised $7.7 million in funding, makes chips that merge computer logic and memory to cut power consumption.

New chips from ThinCI – funded by Intel’s former chief product officer Dadi Perlmutter and venture capitalist Dado Banatao – can be installed in a network of security cameras to track packages or identify criminals. Tenstorrent, founded by former Nvidia and AMD engineers, is breeding a chip architecture that can scale from servers down to sensors.

Horizon Robotics recently raised $100 million from Intel Capital and others to architect chips for everything from autonomous cars to smart speakers. It was founded by Baidu’s former head of deep learning research Kai Yu, and it has partnered with Bosch and factory robot maker Midea, among others, while working on its brain processing unit (BPU).
