Texas Instruments

Autonomous Vehicles Deliver Advanced Safety and Convenience

June 12, 2025
Sponsored by Texas Instruments: AVs offer a huge technological “wow” factor, but they must provide a safe and convenient ride. This article examines technologies like ADAS and advanced semiconductors, and how AI-enabled sensor fusion brings it all together.


The rise of autonomous vehicles (AVs) is poised to redefine the transportation landscape. As they move from futuristic concept to everyday reality, these driverless cars will have far-reaching effects on how drivers interact with them. Eventually, once the glamor wears off, AVs will need to provide an experience as seamless as using a smartphone to be accepted by the masses.

Several trends are emerging in the AV space, but the two main overarching directives are safety and convenience. Safety stands alone, while convenience has multiple components.

One trend is the evolution of advanced exteroceptive sensors, such as solid-state and 3D LiDAR, frequency-modulated continuous-wave (FMCW) radar, AI-integrated cameras, and ultrasonic sensors. A parallel transformation continues in the advanced chip ecosystem: embedded processors with edge AI, intelligent application-specific integrated circuits (ASICs), purpose-built systems-on-chips (SoCs), and microcontrollers (MCUs).


On the safety side, there are chips designed specifically to meet the latency, power, and performance requirements of AVs under regulations and assessment programs such as Euro NCAP and UN R79, the ISO 26262 functional-safety standard, and AEC-Q100/Q200 qualification requirements. ISO 26262 defines the automotive safety integrity levels (ASILs) these systems must meet. The highest levels are generally implemented with dual-core lockstep MCUs. For systems that require lower safety integrity levels, e.g., ASIL B, a simpler and lower-cost combination of safety mechanisms, such as error-correcting codes (ECC) and built-in self-test (BIST), can be used.
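
To make the lockstep idea concrete, the sketch below mimics the compare-and-trap pattern in software. It's purely illustrative, with a toy control law and hypothetical names; a real ASIL-D MCU performs the comparison in hardware, where a delayed shadow core checks every bus transaction and also catches transient faults that a same-core software rerun would miss.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical control law, used only to illustrate the pattern. */
    static int32_t brake_torque(int32_t wheel_speed, int32_t target_speed) {
        return (wheel_speed - target_speed) * 4;   /* toy P-controller */
    }

    /* Software analogue of a lockstep check: compute twice, trap on
     * any mismatch, and fall back to a safe state. */
    static int32_t brake_torque_checked(int32_t ws, int32_t ts) {
        int32_t a = brake_torque(ws, ts);
        int32_t b = brake_torque(ws, ts);
        if (a != b)
            abort();   /* divergence detected: enter the safe state */
        return a;
    }

    int main(void) {
        printf("torque = %d\n", (int)brake_torque_checked(120, 100));
        return 0;
    }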

Scalability is another objective. These devices must span everything from entry-level to luxury vehicles.

Safety itself comprises many components, but the main one is driver safety, which is largely supported by advanced driver-assistance systems (ADAS). In fact, ADAS alone is projected to prevent as many as 37 million accidents by 2050.1

ADAS is All About the Drive

ADAS is a complex and sophisticated data-acquisition and processing system that functions as the eyes and ears of the AV. The system consists of several essential components, including an electronic control unit (ECU), sensors, cameras, software, radar, LiDAR, artificial intelligence (AI), and interfaces.

Those elements work together to collect and analyze vast amounts of data that, when combined through sensor fusion, create an intelligent, self-driving vehicle.

Sensor Fusion and ADAS: The Ultimate Solution

Sensor fusion is an integral component of ADAS. It’s AI-enabled software that manages the complex data received from sensors. Together with advanced MCUs and SoCs, sensor fusion enhances ADAS capabilities, such as forward collision warning (FCW), autonomous emergency braking (AEB), collision avoidance, blind spot detection (BSD), self-parking, lane-keeping, and more. It enhances safety and convenience and improves the overall driving experience. 
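
As a concrete taste of how FCW and AEB decisions are made, a common building block is the time-to-collision (TTC) estimate: the range to the object ahead divided by the closing speed. The minimal sketch below uses placeholder thresholds, not production values:

    #include <math.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* Time-to-collision in seconds; INFINITY when the gap is opening. */
    static double ttc_seconds(double range_m, double closing_speed_mps) {
        if (closing_speed_mps <= 0.0)
            return INFINITY;        /* not closing: no collision course */
        return range_m / closing_speed_mps;
    }

    /* Illustrative two-stage policy: warn early, brake late.
     * Thresholds are placeholders, not production calibrations. */
    static bool should_warn(double ttc)  { return ttc < 2.5; }
    static bool should_brake(double ttc) { return ttc < 1.2; }

    int main(void) {
        double ttc = ttc_seconds(20.0, 10.0);  /* 20-m gap, closing 10 m/s */
        printf("TTC %.1f s, warn=%d, brake=%d\n",
               ttc, should_warn(ttc), should_brake(ttc));
        return 0;
    }

With these sample numbers, the TTC of 2.0 seconds triggers the forward collision warning but not yet the autonomous emergency brake.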

Advanced sensors are able to acquire data rich in content with extremely fine detail from radar, LiDAR, inertial sensors, ultrasonics, the Global Navigation Satellite System (GNSS), and more (Fig. 1). Sensor fusion aggregates, analyzes, and weights these sensor streams.
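
One simple way a fusion stage "weights" redundant measurements is inverse-variance averaging, in which each sensor's estimate counts in proportion to how much it's trusted. The sketch below is a minimal illustration with made-up noise figures:

    #include <stdio.h>

    /* Fuse two independent range estimates (e.g., radar and camera) by
     * inverse-variance weighting: the lower-noise sensor dominates. */
    typedef struct { double value; double variance; } estimate_t;

    static estimate_t fuse(estimate_t a, estimate_t b) {
        double wa = 1.0 / a.variance, wb = 1.0 / b.variance;
        estimate_t out;
        out.value    = (wa * a.value + wb * b.value) / (wa + wb);
        out.variance = 1.0 / (wa + wb);   /* fused estimate is tighter */
        return out;
    }

    int main(void) {
        estimate_t radar  = { 50.2, 0.04 };  /* made-up noise figures */
        estimate_t camera = { 49.1, 0.25 };
        estimate_t fused  = fuse(radar, camera);
        printf("range %.2f m (var %.3f)\n", fused.value, fused.variance);
        return 0;
    }

Because radar's range noise is assumed to be far lower here, the fused estimate lands close to the radar reading, with a smaller variance than either sensor alone.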

That volume of data requires fast, wide data paths and processing devices. This is where state-of-the-art ICs come in, such as TI's FPD-Link serializer/deserializer (SerDes) family. These wideband ICs deliver uncompressed video, power, and control data over a single, low-latency cable to processors such as the Jacinto 7 family. Such wide data paths and next-generation silicon will push AVs to the next level.
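
A quick back-of-the-envelope calculation shows why such links must be wideband: uncompressed video bandwidth is simply resolution × frame rate × bits per pixel. The camera parameters below are illustrative assumptions, not FPD-Link or Jacinto specifications:

    #include <stdio.h>

    int main(void) {
        /* Illustrative 2-MP automotive camera, not a datasheet figure. */
        const double width = 1920, height = 1080;
        const double fps = 30.0;
        const double bits_per_pixel = 16.0;   /* e.g., RAW/YUV formats */

        double gbps = width * height * fps * bits_per_pixel / 1e9;
        printf("Uncompressed stream: %.2f Gb/s per camera\n", gbps);
        /* ~1 Gb/s per camera; multiply by 8-12 cameras plus radar and
         * LiDAR streams, and the need for fast serial links is clear. */
        return 0;
    }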

Sensor fusion comes in two flavors: early and late. With early, or low-level, fusion, the raw data from different sensors is combined before any high-level processing or decision-making. Late, or high-level, fusion processes the data from each sensor separately before combining the results. In the late-fusion case, for example, a camera pipeline might identify a pedestrian while a LiDAR pipeline independently determines the pedestrian's location and direction of travel; the two results are then merged.

Both approaches have their merits, and many advanced self-driving systems use a mix of early and late fusion strategies to get the best performance.
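
The contrast between the two strategies can be sketched in a few lines. Every type and function here is a hypothetical stand-in for a much larger pipeline:

    /* Hypothetical stand-ins; real pipelines are vastly larger. */
    typedef struct { int n_points; } raw_t;       /* placeholder payload */
    typedef struct { int n_objects; } objects_t;  /* placeholder payload */

    static raw_t combine_raw(raw_t a, raw_t b) {  /* align + merge raw data */
        raw_t r = { a.n_points + b.n_points };
        return r;
    }

    static objects_t detect(raw_t d) {            /* stand-in detector */
        objects_t o = { d.n_points / 100 };
        return o;
    }

    static objects_t combine_objects(objects_t a, objects_t b) {
        objects_t o = { a.n_objects > b.n_objects ? a.n_objects : b.n_objects };
        return o;                                 /* associate + dedupe tracks */
    }

    /* Early (low-level) fusion: merge raw streams first, detect once. */
    objects_t early_fusion(raw_t cam, raw_t lidar) {
        return detect(combine_raw(cam, lidar));
    }

    /* Late (high-level) fusion: detect per sensor, then merge results. */
    objects_t late_fusion(raw_t cam, raw_t lidar) {
        return combine_objects(detect(cam), detect(lidar));
    }

Early fusion gives the detector the richest input but demands tight sensor synchronization and huge data paths; late fusion is more modular and fault-tolerant, which is one reason production systems mix the two.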

Going forward, sensor fusion will be advanced by integrating new technologies such as next-level multisensor data fusion, big-data processing, 3D radar, 3D sonar, and deep learning, moving AVs toward ever-smarter vehicles.

ADAS Convenience

The capabilities of ADAS improve not only driving safety, but convenience as well. Covering the many ADAS convenience functions here is impractical, but one that stands out is parking assistance (Fig. 2). It may not be the most glamorous ADAS capability, but who hasn't struggled to squeeze into a parking spot with only inches to spare? ADAS promises to rid the world of manual parking.

There are two types of AV parking assistance: passive and active. Passive systems alert drivers to parking hazards via audible and haptic cues, typically distance alerts derived from camera sensors. The driver operates the car as usual, paying attention (or not) to the alerts, but still parks the old-fashioned way, manually controlling steering and braking. These systems only warn the driver of impending contact with surrounding objects.
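
In essence, the passive case reduces to mapping a measured distance to an alert intensity. Here's a minimal sketch with illustrative thresholds; real systems tune these per vehicle and sensor placement:

    #include <stdio.h>

    /* Map an obstacle distance (cm) to a beep interval (ms).
     * Thresholds are illustrative, not taken from any real system. */
    static int beep_interval_ms(int distance_cm) {
        if (distance_cm > 150) return 0;     /* silent: nothing nearby  */
        if (distance_cm > 80)  return 500;   /* slow beeps              */
        if (distance_cm > 40)  return 200;   /* faster as the gap closes */
        return -1;                           /* continuous tone: stop!  */
    }

    int main(void) {
        int samples[] = { 200, 120, 60, 25 };
        for (int i = 0; i < 4; i++)
            printf("%3d cm -> %d ms\n", samples[i],
                   beep_interval_ms(samples[i]));
        return 0;
    }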

Active systems are truly hands-off. Parallel, perpendicular, and reverse-in options are available, depending on the vehicle; the feature set is currently vendor-specific.

Active parking assistance assumes control of the vehicle's brakes, steering, acceleration, and gear selection. It uses the same sensors as driving. However, because parking obstacles range from a few feet away (other vehicles) to a few inches (curbs), the analysis and decisions made by the various systems differ.
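
Active parking is typically structured as a small state machine over maneuver phases. The sketch below uses hypothetical states and flags purely to show the shape of the control flow, not any vendor's implementation:

    /* Hypothetical maneuver phases for an automated parallel park. */
    typedef enum { SCAN, ALIGN, REVERSE_IN, STRAIGHTEN, DONE } park_state_t;

    park_state_t step(park_state_t s, int slot_found, int aligned,
                      int in_slot, int wheels_straight) {
        switch (s) {
        case SCAN:       return slot_found      ? ALIGN      : SCAN;
        case ALIGN:      return aligned         ? REVERSE_IN : ALIGN;
        case REVERSE_IN: return in_slot         ? STRAIGHTEN : REVERSE_IN;
        case STRAIGHTEN: return wheels_straight ? DONE       : STRAIGHTEN;
        default:         return DONE;
        }
    }
    /* Each phase issues its own steering, brake, and gear commands, and
     * sensor thresholds tighten as obstacles shrink from feet to inches. */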

Summary

Autonomous driving’s goal is to deliver consistently safe, predictable, uneventful rides. While the technology is exciting, and ADAS and sensor-fusion capabilities are advancing quickly, those capabilities will run well ahead of AV deployment. Advanced Level 3 use cases should continue to emerge by the end of this decade, and Level 4 by the mid-to-late 2030s.2 Predictions for Level 5 are scarce beyond that.

However, even as the technology matures, many impediments along that path aren’t technology-based. Regulatory and legal frameworks, smart infrastructure, security, and vehicle-to-everything (V2X) communications are all still in their early stages. So, the cars of the science-fiction realm remain a long way off.

References

1. AAA Foundation for Traffic Safety, “Safety Benefits of ADAS”: https://aaafoundation.org/wp-content/uploads/2023/07/AAAFTS-Safety-Benefits-of-ADAS.pdf

2. World Economic Forum, WEF_Autonomous_Vehicles_2025.pdf
