Mythic, a startup tapping into the demand for dedicated artificial intelligence processing in factories, drones, cars and other embedded systems, has added another $30 million in venture capital. The funding came from existing investors SoftBank, Threshold Ventures and Lux Capital, among others, as well as new investors including Micron Technology and Lam Research.
The funding brings its total amount raised to $86 million as it plans to start shipping its intelligence processing unit (IPU) to early customers in the fourth quarter of 2019. Mythic's IPU is designed to perform analog processing inside flash memory, handling artificial intelligence workloads at higher performance than general-purpose GPUs and CPUs. Combining compute and memory also limits data movement, lowering power consumption.
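The power advantage of compute-in-memory comes from physics: if weights are stored as cell conductances and inputs applied as voltages, each column of cells sums its currents into a dot product, so the weights never have to be shuttled to a separate processor. The toy model below is a deliberate simplification for illustration only, not a description of Mythic's actual circuit design; the function name and units are hypothetical.

```python
# Toy illustration of analog in-memory compute (hypothetical simplification,
# not Mythic's actual design). Weights are stored as flash-cell conductances G,
# inputs are applied as voltages V; by Ohm's law each cell passes current
# I = G * V, and the bit line sums those currents (Kirchhoff's current law),
# producing a dot product with no weight movement.

def analog_dot_product(conductances, voltages):
    # Each term models the current through one cell; the bit line sums them.
    return sum(g * v for g, v in zip(conductances, voltages))

weights = [0.5, 1.0, 0.25]   # conductances (arbitrary units)
inputs = [2.0, 1.0, 4.0]     # input voltages (arbitrary units)
print(analog_dot_product(weights, inputs))  # 0.5*2 + 1.0*1 + 0.25*4 = 3.0
```

In a digital accelerator the same multiply-accumulate would require fetching every weight from memory each time, which is where most of the energy goes; keeping the weights in place is the efficiency argument behind the approach.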
Mythic was founded in 2012 by chief executive officer Mike Henry, who leads software engineering in Redwood City, California, and chief technology officer Dave Fick, who heads the hardware development team in Austin, Texas. The additional funding “underscores our belief that Mythic will play a leading role in the explosive deployment of A.I. products both in the data center and at the edge," Henry said in a statement.
While Nvidia GPUs are the current gold standard for training neural networks—the fundamental building block of machine learning—Mythic IPUs are targeting inferencing. Today training and inferencing are both typically done inside data centers, where power and memory are effectively limitless. Mythic and many other semiconductor industry players are trying to take inferencing out of the cloud and push it out to the network edge instead.
Mythic could struggle to stand out in an increasingly crowded market. Nvidia is doubling down on its inferencing investments, and Qualcomm, NXP Semiconductors and other major players in the semiconductor space are pouring money into inferencing as well. Arm plans to introduce chip designs dedicated to artificial intelligence. Mythic could also face challenges from startups Habana, Horizon Robotics, Thinci and many, many others.
Shipments of edge computing chips tuned for artificial intelligence will more than double to 340.1 million units in 2019, according to estimates by market researcher IDC. Last year, around 5% of embedded systems running artificial intelligence jobs used such chips. But the rate is projected to reach 40.5% in 2023, IDC estimates. Shipments are set to increase to 1.5 billion units by 2023, representing a roughly $40.4 billion market.
"The success of AI resides in systems deployed to the edge, where instant decisions made by neural networks can actually create value, unfettered by latency and connectivity issues that can challenge cloud-based solutions," Michael Palma, an emerging technology analyst at IDC, said in a statement. "But the promise of AI at the edge depends on the development of highly efficient compute processing elements," Palma said.