
NXP Builds Computer Engine for Self-Driving Cars

May 17, 2016
NXP has introduced a computing platform to help automakers build and test self-driving cars. It combines multiple streams of sensor data, forming a three-dimensional model of the vehicle's surroundings.

Several years ago, computers that could make sense of machine vision and other sensor data must have seemed to automakers like black boxes: completely inscrutable save for their outputs. Now, NXP Semiconductors has introduced an open computing platform a few shades lighter than black, called BlueBox, to help automakers build and test self-driving cars.

BlueBox is a central computer engine fueled by the bits streaming in from sensors around the vehicle. It knits together all the different sensors found in autonomous cars, including radars, cameras, and lidar systems. Multiple streams of sensor data are routed to BlueBox, which combines them into a three-dimensional model of the vehicle’s surroundings.
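NXP has not published a programming interface for BlueBox, but the core idea of sensor fusion can be sketched in a few lines of Python. Everything below is hypothetical: the Detection type, the confidence-weighted clustering rule, and the half-meter grouping threshold are stand-ins, meant only to show how readings from radar, camera, and lidar might be merged into one set of 3D objects.

```python
# Illustrative sensor fusion: merge per-sensor detections into one 3D
# picture of the surroundings. All names here are hypothetical, not
# NXP's BlueBox API.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "radar", "camera", or "lidar"
    timestamp: float   # seconds
    position: tuple    # (x, y, z) in meters, vehicle frame
    confidence: float  # 0.0 to 1.0

def fuse(detections, max_gap=0.5):
    """Group detections within max_gap meters of each other and average
    their positions, weighted by confidence."""
    clusters = []
    for det in detections:
        x, y, z = det.position
        for cluster in clusters:
            cx, cy, cz = cluster["position"]
            if abs(cx - x) + abs(cy - y) + abs(cz - z) < max_gap:
                w = cluster["weight"] + det.confidence
                cluster["position"] = (
                    (cx * cluster["weight"] + x * det.confidence) / w,
                    (cy * cluster["weight"] + y * det.confidence) / w,
                    (cz * cluster["weight"] + z * det.confidence) / w,
                )
                cluster["weight"] = w
                cluster["sensors"].add(det.sensor)
                break
        else:
            clusters.append({"position": det.position,
                             "weight": det.confidence,
                             "sensors": {det.sensor}})
    return clusters

# One object seen by all three sensors, plus a radar-only contact.
frame = [
    Detection("radar", 0.00, (12.1, 0.4, 0.0), 0.90),
    Detection("camera", 0.01, (12.3, 0.5, 0.0), 0.80),
    Detection("lidar", 0.01, (12.2, 0.4, 0.1), 0.95),
    Detection("radar", 0.00, (40.0, -3.0, 0.0), 0.70),
]
for obj in fuse(frame):
    print(sorted(obj["sensors"]), obj["position"])
```

A production system would also track objects across frames and weigh each sensor's strengths, such as radar for range and speed, cameras for classification, and lidar for shape, but the clustering step above captures the heart of the job.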

Computers that process sensor data in real time can vastly improve how autonomous cars make decisions on the road, whether that means slowing down ahead of traffic jams or slamming on the brakes when a person steps into the road. In the near term, BlueBox could be tested to enhance advanced driver-assistance systems that alert drivers to, or intervene in, dangerous situations, such as automatic braking and blind-spot warnings.
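As a rough illustration of the kind of real-time decision involved, the sketch below compares the time to collision (TTC) with an object ahead against two thresholds, one for warning the driver and one for braking. The function name and threshold values are invented for the example and do not reflect how BlueBox or any production system is tuned.

```python
# Hypothetical decision rule comparing time-to-collision (TTC) against
# alert and braking thresholds; the numbers are chosen only for illustration.
def decide(distance_m, closing_speed_mps, alert_ttc=3.0, brake_ttc=1.5):
    if closing_speed_mps <= 0:
        return "no action"  # the object is not getting closer
    ttc = distance_m / closing_speed_mps
    if ttc < brake_ttc:
        return "automatic braking"
    if ttc < alert_ttc:
        return "alert driver"
    return "no action"

# A pedestrian 20 m ahead while closing at 15 m/s gives a TTC of about 1.3 s.
print(decide(20.0, 15.0))  # -> automatic braking
```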

At the same time, the engine contains other technologies that could eventually transfer greater control to self-driving cars. According to NXP, BlueBox incorporates “the embedded intelligence and machine learning required for complete situational assessments, supporting advanced classification tasks, object detection, localization, mapping and vehicle driving decisions.” The platform also includes an onboard system for sharing position data with other cars on the road.

At the heart of the system are two automotive processors. One is the S32V automotive vision processor, a holdover from Freescale that survived its recent merger with NXP Semiconductors. Linked with a graphics processing unit, the vision system can identify images from cameras located around the vehicle, extract features to classify objects, and create three-dimensional models.
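NXP has not detailed the S32V's software stack, but the pipeline described above, from capturing frames to feature extraction, classification, and updating a 3D model, can be outlined schematically. Every function in this sketch is a toy stand-in for what the real silicon does.

```python
# Schematic outline of a camera-based vision pipeline; all functions are
# hypothetical stand-ins, not the S32V's actual interfaces.
import random

def capture_frame(camera_id):
    # Stand-in for reading an 8x8 grayscale frame from one camera.
    return [[random.random() for _ in range(8)] for _ in range(8)]

def extract_features(frame):
    # Toy feature vector: mean brightness of each row.
    return [sum(row) / len(row) for row in frame]

def classify(features):
    # Toy classifier keyed on overall brightness.
    return "vehicle" if sum(features) / len(features) > 0.5 else "background"

def update_model(world, camera_id, label):
    # Fold the classification into a (very simplified) world model.
    world.setdefault(camera_id, []).append(label)

world_model = {}
for cam in ("front", "rear", "left", "right"):
    label = classify(extract_features(capture_frame(cam)))
    update_model(world_model, cam, label)
print(world_model)
```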

While the vision processor was developed with CogniVue, an image cognition company that Freescale purchased last year, the other half of BlueBox comes from NXP’s embedded catalog. The platform employs an embedded computer processor built with eight 64-bit ARM cores, giving BlueBox 90,000 million instructions per second (MIPS) of performance. The system is on display this week at the NXP FTF Technology Forum in Austin.

BlueBox is already being tested by four of the five largest automobile companies in the world, NXP said. The company would not specify which automakers are using the platform, but the five largest in 2015 were Toyota, Volkswagen, Germany’s Daimler, BMW, and Honda.

BlueBox could help automakers not only develop advanced safety features but also take significant steps toward fully autonomous cars as early as 2020, NXP claims. Merging sensor data with machine learning, BlueBox has all the elements necessary for “Level-4” autonomous vehicles, as defined by the Society of Automotive Engineers. Level-4 vehicles can handle all aspects of driving under defined conditions, from turning onto the highway to slogging through a traffic jam.

Level-5 vehicles are the holy grail, handing total control to the vehicle so that even the steering wheel and manual brakes can be removed entirely. Tesla models equipped with Autopilot fall into the Level-3 category, which covers cars that can drive on highways without human intervention. Even so, drivers in Level-3 cars must be prepared to grab the wheel and take control at any moment.

BlueBox is one of NXP’s answers to computer vision platforms from Mobileye and Nvidia, the latter of which has tried to position its graphics chips and processors as the eyes and brains of self-driving cars. Last year, Nvidia revealed the Drive PX platform for fusing sensor data from around the vehicle. It also uses deep neural networks, programs that attempt to simulate the structure and operation of the human brain, to detect and classify objects on the road.
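The networks running on platforms like Drive PX are vastly larger and are trained on labeled road imagery, but the underlying mechanism, stacked layers of weighted sums passed through nonlinearities, fits in a short sketch. The layer sizes, class labels, and random weights below are placeholders; a real network learns its weights from data.

```python
# Toy feed-forward network showing the mechanism behind deep neural
# networks; the random weights here are for shape only, not a trained model.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# 64 input features -> 16 hidden units -> 3 classes
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)
classes = ["pedestrian", "vehicle", "background"]

def forward(features):
    hidden = relu(features @ W1 + b1)   # weighted sum + nonlinearity
    logits = hidden @ W2 + b2           # weighted sum to class scores
    return classes[int(np.argmax(logits))]

patch = rng.normal(size=64)  # stand-in for features from an image patch
print(forward(patch))
```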

NXP Semiconductors, which claims to have shipped around 30 million advanced driver-assistance systems to date, has also branched out into other products for autonomous cars. In January, the company introduced a radar transceiver that could replace the ultrasonic sensors widely used in emergency braking and other safety systems. Google X is now testing the chip in its self-driving cars.

Unlike a closed black-box system, the BlueBox platform was designed so that automakers can build layers of software and machine learning on top of it. That aligns with the industry’s broader efforts to guard against companies like Apple and Google, which threaten to turn automobiles into little more than containers for advanced software.
