As the pace of processor development slows, many companies are betting that custom silicon can cut the cost of machine learning in embedded devices and give them independence from the internet. But even though millions of dollars are pouring into new chips, some argue there is nothing wrong with existing hardware.
The problem is that software is too rough around the edges, and increasingly investors are on board with startups trying to change that. Seattle, Washington-based XNOR, which has designed neural networks that consume a fraction of the memory and power of conventional models, announced on Tuesday that it raised $12 million in venture funding.
Founded by former scientists at the Allen Institute for Artificial Intelligence, XNOR is trying to trim the fat from machine learning models so that they run on hardware as simple and low-cost as the Raspberry Pi. That puts it directly in the path of companies creating custom chips that accelerate neural networks but could cost far more than existing hardware.
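XNOR takes its name from the logic operation at the heart of binarized neural networks, an approach its founders helped pioneer: weights and activations are reduced to +1 or -1 and packed into bits, so the expensive floating-point multiply-accumulate of a dot product becomes an XNOR followed by a popcount, which cheap integer hardware handles easily. The sketch below is purely illustrative (not XNOR's actual code) and shows the arithmetic trick on a single four-element dot product:

```python
# Illustrative sketch (not XNOR's production code): binarized inference
# replaces floating-point multiply-accumulate with XNOR + popcount on
# bit-packed {-1, +1} vectors, shrinking memory and avoiding float math.

def binarize(values):
    """Pack real-valued inputs into bits: 1 for non-negative (+1), 0 for negative (-1)."""
    bits = 0
    for i, v in enumerate(values):
        if v >= 0:
            bits |= 1 << i
    return bits

def xnor_dot(a_bits, b_bits, n):
    """Dot product of two bit-packed {-1, +1} vectors of length n."""
    # XNOR marks positions where the signs match; popcount tallies them.
    matches = bin(~(a_bits ^ b_bits) & ((1 << n) - 1)).count("1")
    # Each match contributes +1 to the dot product, each mismatch -1.
    return 2 * matches - n

weights = [0.4, -1.2, 0.7, 0.1]   # signs: +1, -1, +1, +1
inputs  = [-0.3, -0.8, 0.5, 0.9]  # signs: -1, -1, +1, +1
print(xnor_dot(binarize(weights), binarize(inputs), len(weights)))  # → 2
```

Packing 32 or 64 weights into a single machine word is what lets such models fit the memory and power budget of a device like the Raspberry Pi, at some cost in accuracy that the training procedure works to recover.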
XNOR is also trying to develop a software platform that allows anyone to integrate state-of-the-art inference models into security cameras, drones and other devices. The toolkit is scheduled to be released before the end of the year, and the company has partnered with semiconductor companies, including Ambarella, to make the algorithms compatible with their products.
“Our ‘A.I. everywhere for everyone’ technology eliminates the need for internet connectivity, runs on inexpensive hardware platforms and eliminates latency inherent in traditional cloud-based A.I. systems,” said Ali Farhadi, founder and chief executive of XNOR, which previously raised $2.6 million in seed funding.
Taking machine learning out of the cloud would allow drones to scan farmland to pinpoint failing crops and recommend the optimum harvest time without being connected to the internet, XNOR said. Smartwatches could measure vital signs without wasting energy to send raw data to the cloud, and smart speakers could perform simple voice recognition and control functions.
The transportation industry could also enlist XNOR's technology. “XNOR's lightweight algorithms will enable A.I. on the edge for a broad range of applications, from cars to fleet management to smart infrastructure,” said Alexei Andreev, a managing director of Autotech Ventures, which joined Madrona Venture Group, NGP Capital and Catapult Ventures in the funding round.
The latest funding puts the startup on the same level as other companies in this business. In January 2017, Neurala raised $14 million to fund the development of off-the-shelf algorithms that can be installed in toys, security cameras and self-driving cars. The company’s approach streamlines neural networks, and its software is already used by customers like Parrot and Teal Drones, among others.
Qualcomm, the largest maker of mobile chips, abides by the same philosophy. It has given customers access to its neural processing engine, a software tool that splits machine learning programs into simpler parts and matches them to the cores inside its general-purpose chips. The company has shelved plans to follow in Apple and Huawei’s footsteps by shipping custom neural network cores in its products.