Self-driving cars are a work in progress. They employ a myriad of technologies, from radar to machine learning, and every one of these systems must work properly for the vehicle to deliver its passengers to their destination intact without colliding with obstacles along the way.
Autonomous vehicles are supposed to prevent car accidents. Unfortunately, accidents happen anyway. Designing and delivering a self-driving car requires exhaustive testing, and there's the rub. Much of the testing of these systems has been done in the field, initially in restricted areas where accidents might damage property but not people.
Lately, there's been a push to test in the real world. This may be premature, but the alternative, simulation, has been lacking. The problem is that simulation of this kind must be extensive and sophisticated, which in turn requires extensive compute resources and complex software. Things become even more complex when one considers hardware-in-the-loop (HIL) testing.
Robotic simulations have been around for quite some time. However, they often simplify the simulated environment and ignore aspects critical to automotive simulations, such as the infrared and radar response of objects to the sensors that are becoming common in autonomous vehicles. Likewise, the resolution and quality of feedback within these systems are often insufficient to meet the requirements of these vehicles.
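To see why sensor response modeling matters, consider a toy comparison: a simulator that only ray-casts geometry reports the same "hit" for a retroreflective road sign and a matte black car, while a physics-based LiDAR model attenuates the return by material reflectivity and range. The function below is a deliberately simplified range-equation sketch, not any vendor's actual model.

```python
def return_power(tx_power: float, reflectivity: float, range_m: float) -> float:
    """Simplified LiDAR return: transmit power scaled by target
    reflectivity and 1/R^2 spreading loss. Real models also account
    for optics, atmospheric attenuation, and beam divergence."""
    return tx_power * reflectivity / (range_m ** 2)

# Same geometry, same range -- very different returns by material.
sign = return_power(tx_power=1.0, reflectivity=0.9, range_m=50.0)
car = return_power(tx_power=1.0, reflectivity=0.05, range_m=50.0)
print(f"sign/car return ratio at 50 m: {sign / car:.0f}x")  # 18x
```

A geometry-only simulator would treat both targets identically; the factor-of-18 difference in return strength is exactly the kind of effect that determines whether a perception stack detects the dark vehicle at all.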
Raising the Simulation Bar
A number of companies are addressing this simulation challenge, including NVIDIA and Siemens. NVIDIA's solution is the DRIVE Constellation Simulation System, while Siemens' solution is part of its Simcenter portfolio that includes the TASS Prescan virtual sensory imagery. Both companies also have self-driving-car hardware and software platforms integrated into their systems, with HIL support.
NVIDIA's DRIVE Constellation Simulation System was announced at its annual GPU Technology Conference (GTC) in San Jose, Calif. DRIVE Constellation consists of two main components. The first is the DRIVE Sim software, which provides the simulation environment, including sensor support for cameras, LiDAR, and radar. The second is the DRIVE Pegasus system, which incorporates NVIDIA hardware and software. The Pegasus hardware provides the HIL support, though the same stack can also run entirely in software. Of course, the simulation environment is powered by NVIDIA GPGPUs.
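The two-component split described above boils down to a closed loop: one node renders simulated sensor data, and the compute platform under test (real hardware in HIL mode, or the same stack in software) returns control commands that feed back into the simulation. The sketch below is purely conceptual; the class names are hypothetical stand-ins, not the DRIVE Constellation API.

```python
class SimulatedWorld:
    """Stands in for the simulation node generating sensor frames."""
    def __init__(self):
        self.position = 0.0  # vehicle position along the road, meters

    def sensor_frame(self) -> dict:
        # A real simulator renders camera/LiDAR/radar data here.
        return {"position": self.position, "obstacle_at": 100.0}

    def apply(self, command: dict) -> None:
        # Advance the simulated vehicle by the commanded amount.
        self.position += command["throttle"]


class DrivingStack:
    """Stands in for the compute platform under test (the HIL node)."""
    def decide(self, frame: dict) -> dict:
        # Slow down as the obstacle gets closer; stop just short of it.
        gap = frame["obstacle_at"] - frame["position"]
        return {"throttle": min(5.0, max(0.0, gap - 1.0))}


world, stack = SimulatedWorld(), DrivingStack()
for _ in range(200):  # fixed-rate loop standing in for real-time sync
    world.apply(stack.decide(world.sensor_frame()))

print(f"stopped at {world.position:.2f} m (obstacle at 100 m)")
```

In a real HIL setup the `decide` step runs on the actual target hardware at the sensor frame rate, which is what makes timing-accurate validation of the production compute platform possible.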
Siemens released its solution at its U.S. Innovation Day in Chicago. TASS Prescan has been available for developing and validating advanced driver-assistance systems (ADAS) and active safety systems. This includes HIL as well as model-in-the-loop (MIL) and software-in-the-loop (SIL) testing. The system works with Siemens’ Mentor DRS360 platform for autonomous vehicles. DRS360 is built on Xilinx Zynq UltraScale+ MPSoC FPGAs and includes Mentor’s software stack for autonomous vehicles.
Siemens has also partnered with Cepton Technologies to provide physics-based LiDAR modeling support. Cepton is an innovative Silicon Valley-based company noted for its long-range, small-footprint LiDAR sensors. Siemens will be working with other vendors to integrate their sensor systems into the development environment.
Extrapolating Real-World Testing
Both systems allow developers to create scripts that control the environment, enabling vehicles to be tested under a variety of conditions, from a clear day to a snowstorm. Simulation lets hardware and software be tested across millions of miles of travel that would not be possible on real roads. Likewise, developers can reproduce any condition repeatedly, whereas a real-world scenario may occur only briefly, such as glare from a rising or setting sun.
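The value of scripted scenarios is the exhaustive, repeatable sweep they make possible. The sketch below illustrates the idea; none of these names come from DRIVE Sim or Prescan, and the "simulator" is a stand-in that merely flags the rare sun-glare case a road test might never reliably hit.

```python
# Hypothetical sketch of scenario scripting for a driving simulator.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Scenario:
    weather: str          # environment preset, e.g. "clear" or "snowstorm"
    sun_elevation: float  # degrees above horizon; low angles produce glare
    traffic_density: int  # vehicles per kilometer of roadway

def run_scenario(scenario: Scenario) -> dict:
    """Stand-in for a simulator run; a real system would render
    camera/LiDAR/radar data and exercise the full vehicle stack."""
    # Flag the glare case that is rare and brief in real-world testing.
    glare = scenario.weather == "clear" and scenario.sun_elevation < 10.0
    return {"scenario": scenario, "glare_present": glare}

# Sweep every combination -- repeatable in a way road testing is not.
weathers = ["clear", "rain", "snowstorm"]
sun_angles = [5.0, 45.0]
densities = [10, 50]
results = [run_scenario(Scenario(w, s, d))
           for w, s, d in product(weathers, sun_angles, densities)]

glare_runs = [r for r in results if r["glare_present"]]
print(f"{len(results)} runs, {len(glare_runs)} with sun glare")
```

Every combination runs deterministically and can be replayed after each software change, whereas the low-sun glare condition occurs on the road for only minutes a day.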
Most aircraft and many other vehicles are designed and tested extensively using simulation, oftentimes before real systems are constructed. That hasn't occurred to the same degree with autonomous vehicles for numerous reasons, including the availability of the simulation solutions just mentioned as well as the hardware needed to run them. The availability of more advanced simulation software, plus new GPGPU and processor systems like those recently announced by NVIDIA, makes this type of sophisticated simulation practical.
The big questions are who will adopt these systems and how extensively they will be used. Should field testing be limited until more simulation can be done? Will system validation be done using simulation? These questions and more are still pending, but the availability of advanced simulation systems adds much more weight to the discussion.