When Intel announced its $15.3 billion bet on Mobileye, it raised questions about how the two chip makers would split up computing jobs in self-driving cars. But according to Intel's chief executive Brian Krzanich, the deal is less about hardware than about Mobileye's ability to collect mapping and driving data from thousands of vehicles.
"Our strategy is to make Intel the driving force of the data revolution across every technology and every industry," Krzanich said in a letter to employees. "We are a data company," he added.
Mobileye's chips interpret camera images to enable features like blind spot warning and lane change assist on highways. But it has also started laying out chips optimized for sensor fusion and for machine learning programs that teach cars to make split-second decisions on city streets. It also makes software for generating high-definition road maps that can be shared.
Intel views self-driving cars as an extension of its data center business, as vehicles will need increasingly powerful computers to crunch data and make decisions on the road. In the view of industry analysts, Mobileye's software will allow the company to control more of what goes under the hood of self-driving cars.
"Tech firms are hunting for ever more data. Miles = data," said Adam Jonas, a Morgan Stanley financial analyst, in a research note after Intel announced the deal.
Mobileye, founded in 1999 in Israel, is aiming to trade in that information. Its latest generation of chips places an emphasis on stitching together multiple cameras with radar and lidar sensors. Through exposure to the sensor data, sophisticated algorithms teach cars to observe the rules of the road, or what the company calls "driving policy."
Amnon Shashua, Mobileye's founder and chief executive, has also focused on how to share that information between manufacturers to improve safety. To that end, the company has developed software for road experience management, which captures road markings and traffic information from cars equipped with front-facing cameras. After the information is compressed, it is sent to the cloud.
The cumulative data is assembled into a vast road map called Roadbook, which gives cars a more detailed view of their surroundings and creates a safety net for their other sensors. Mobileye, which has signed deals with Volkswagen and BMW to use the technology, is aiming to map all the roadways in the United States by 2018.
"That data is extremely critical and it takes time to replicate," said Kevin Krewell, a principal analyst at Tirias Research. It could have taken two or three years — an entire design cycle in the automotive market — for Intel to build a similar database on its own, he said. The future of Mobileye's hardware is less clear right now.
Krzanich said that the company would sell that information and wrap it into services for automakers with fewer technological chops for self-driving cars. In that way, Intel is trying not only to compete with rivals like Nvidia and Qualcomm, but also to provide an alternative to Google, which has spun out its autonomous driving project into a new company.
Intel has telegraphed its appetite for data in the last year, buying 15% of the digital mapping firm Here, which is owned by a group of German automakers including BMW and Volkswagen. Mobileye completed a separate deal in late December to swap data with Here, giving the mapping firm access to the camera imagery used in Roadbook.
“Every autonomous car out there shouldn’t have to find the same pothole and log it,” said Kathy Winter, general manager of Intel's automated driving solutions unit and a former vice president of automated driving software at Delphi, in a February blog post.
Intel faces stiff competition from its peers in the chip industry. Nvidia is tuning its graphics chips for what it calls "end-to-end" deep learning, in which software teaches itself how to drive with little human instruction. Nvidia's engineers have also been working on software to read drivers' lips and alert drivers who are falling asleep.
Its algorithms learn to drive in lanes, for instance, by watching the road through a front-facing camera, said Danny Shapiro, Nvidia's director of automotive, in a recent presentation at the South by Southwest festival in Austin, Texas. The firm has entered deals with many automakers, including one with Audi to catalog street signs in Germany.
Shapiro said the company's algorithms had become so advanced that engineers didn't have to rely as much on sensors for the tricky task of debugging deep learning software. Its self-driving software is based on neural nets with around 100 layers, so Nvidia's engineers can locate bugs in individual layers and make changes, he told Electronic Design.
There are other signs that the center of gravity in self-driving cars is moving toward intelligent algorithms. Rick Clemmer, the chief executive of NXP Semiconductors, said that the company was selling itself to Qualcomm for $47 billion partly because it had fallen behind in machine learning. Industry analysts said that NXP had focused so heavily on sensor fusion because it required the most silicon.
Machine learning is also a big part of Intel's automotive strategy, along with wireless connectivity, vision processing, and software for over-the-air updates. The company has also moved onto other architectures to handle the needs of self-driving cars: Its Go automated driving system uses FPGA accelerators.
But both sides of the Mobileye deal said that hardware is ultimately just one component of a much broader autonomous driving system. "When you talk about a solution, then the position of the silicon, although it is important, becomes much, much smaller," Shashua said in the investor call last week.
“There is much more than silicon going on here.”