How Can Autonomous Vehicles and Pedestrians Safely Coexist?

Oct. 6, 2018
Only when autonomous vehicles can reliably detect and classify every object in their surroundings will they ever be able to operate safely amongst pedestrians—and that requires thermal sensors.

Today’s autonomous vehicles may not yet have reached Level-5 autonomy, but they are, nonetheless, being deployed on public roads. In February, California’s Department of Motor Vehicles approved regulations to permit driverless testing on public roads. Even following the Uber crash in March, cities continue to willingly open their roads to testing. The Massachusetts Department of Transportation signed a Memorandum of Understanding in June to open its roads to autonomous-vehicle testing. During these testing periods, in which autonomous vehicles will coexist with pedestrians, extra safety precautions must be taken. It’s particularly worrisome when autonomous vehicles are operating around pedestrians at night.

According to the Insurance Institute for Highway Safety, pedestrian fatalities increase the most after the sun has gone down. In fact, over three-quarters of all pedestrian deaths happen at night. The Uber crash from March of this year also occurred at night.

For autonomous vehicles to achieve full, Level-5 autonomy and successfully and safely coexist with pedestrians in cities, their sensing solutions must be infallible. But at this point, the sensors used by most automakers simply can’t deliver reliable detection and, therefore, can’t reach full autonomy.

Radar, LiDAR Not Yet Able to Deliver Full Autonomy

The reason fully autonomous vehicles have yet to take over our roadways is that their current sensor suites aren’t up to the task. Today, most automakers’ choices for sensing technologies—radar, cameras, and LiDAR—are unable to deliver complete coverage and detection in all scenarios. For this reason, a human driver must be ready at every moment to take control of the vehicle.

Many automakers outfit their vehicles with radar, as it can sufficiently detect objects at long range. However, radar can’t sufficiently identify these objects. Cameras are another commonly used perception solution, though they’re equally, if conversely, flawed. Unlike radar, cameras can accurately identify objects, but only at close range. For this reason, many automakers use radar and camera together to provide more complete detection and coverage of vehicles’ surroundings: Radar will detect an object far down the road, and the camera will provide a clearer picture of it as it approaches.
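To make that division of labor concrete, here’s a minimal sketch of how such radar/camera pairing might look in software, assuming hypothetical detection types, a made-up camera range limit, and a made-up confidence threshold. It’s an illustration, not any automaker’s actual fusion stack.

```python
# Minimal sketch: radar supplies where the object is; the camera, once the object
# is close enough to identify, supplies what it is. All names and thresholds are
# hypothetical, chosen only to illustrate the division of labor.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarDetection:
    range_m: float            # distance to the object
    closing_speed_mps: float  # how fast the gap is shrinking

@dataclass
class CameraDetection:
    label: str                # e.g. "pedestrian", "vehicle"
    confidence: float

CAMERA_RELIABLE_RANGE_M = 60.0  # assumed range within which the camera classifies well

def fuse(radar: RadarDetection, camera: Optional[CameraDetection]) -> dict:
    """Combine a radar track with a camera classification, if one is available and trustworthy."""
    fused = {"range_m": radar.range_m, "label": "unknown"}
    if camera and radar.range_m <= CAMERA_RELIABLE_RANGE_M and camera.confidence > 0.7:
        fused["label"] = camera.label
    return fused

print(fuse(RadarDetection(120.0, -8.0), None))                                 # far: detected, unclassified
print(fuse(RadarDetection(45.0, -8.0), CameraDetection("pedestrian", 0.93)))   # near: detected and classified
```

The asymmetry is the point: far away, the fused track has a position but no label; the label only fills in once the camera can see the object clearly.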

Given the obvious flaws of radar and cameras, automakers also equip their autonomous vehicles with LiDAR (light detection and ranging) sensors. Like radar, LiDAR works by sending out signals and using the reflection of those signals to measure the distance between the vehicle and the object (radar uses radio signals; LiDAR uses lasers or light waves).
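Both technologies reduce to the same ranging arithmetic: time the echo and halve the round trip. A minimal sketch of that calculation:

```python
# Back-of-the-envelope round-trip ranging shared by radar and LiDAR: the sensor emits
# a signal, times the returning echo, and halves the round trip. Radio waves and laser
# light both travel at the speed of light.
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(round_trip_s: float) -> float:
    """Distance to the reflecting object, in meters."""
    return C * round_trip_s / 2.0

# An echo arriving 400 ns after emission corresponds to an object roughly 60 m away.
print(f"{range_from_round_trip(400e-9):.1f} m")
```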

While LiDAR does provide a wider field of view than radar, it remains unfit as the ideal sensing solution for autonomous vehicles because, at present, it’s cost-prohibitive for mass-market applications. Although several companies are attempting to reduce the price of LiDAR sensors, their lower-cost sensors have much lower resolution and, therefore, can’t provide the detection and coverage needed to reach Level-5 autonomy.

Danger: Sensors that Can Detect but Not Classify Objects

To compensate for each sensing technology’s respective weakness, automakers often outfit their AVs with several different solutions in a redundant mosaic of sensors. With this approach, where one sensor fails at detection, the others can back it up.

This may seem like a satisfactory solution and one that could ultimately deliver full autonomy, but there remains a key obstacle: What happens when these sensors can successfully detect an object, but can’t correctly classify it?

Consider the Uber crash: According to a report from the National Transportation Safety Board, the vehicle detected the pedestrian six seconds before the accident, but the self-driving system classified her first as an unknown object, then as a vehicle, and then as a bicycle. In other words, the vehicle’s sensors detected the victim, but the software wrongly determined that she wasn’t in danger and that no evasive action was required.

This is a tragic example of an autonomous vehicle declaring a false positive. A false positive occurs when an autonomous vehicle successfully detects an object but wrongly classifies it. The software in autonomous vehicles is programmed to ignore certain objects, like an errant plastic bag or a newspaper blowing across the street. These accommodations must be made for autonomous vehicles to drive smoothly, especially on high-speed roads.
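As a rough illustration of why such ignore lists are double-edged, here’s a minimal sketch, assuming hypothetical class names and a made-up time-to-collision threshold. It does not represent Uber’s, or any vendor’s, actual planning logic.

```python
# Illustrative sketch only: planning code that skips object classes deemed harmless
# takes no evasive action when a pedestrian is misclassified into one of those classes.
# Class names and the braking rule are hypothetical.
IGNORABLE_CLASSES = {"plastic_bag", "newspaper", "leaf_debris"}

def evasive_action_required(predicted_class: str, distance_m: float, closing_speed_mps: float) -> bool:
    """React only to non-ignorable objects on a near-term collision course."""
    if predicted_class in IGNORABLE_CLASSES:
        return False                       # a classification error here silently disables the response
    time_to_collision_s = distance_m / max(closing_speed_mps, 0.1)
    return time_to_collision_s < 4.0       # hypothetical braking threshold

print(evasive_action_required("pedestrian", 25.0, 17.0))    # True: brake
print(evasive_action_required("plastic_bag", 25.0, 17.0))   # False: ignored, even if it's actually a person
```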

However, Uber’s fatal incident proves that building a machine-vision algorithm that can intelligently determine which detected objects are in danger from, or pose a danger to, the vehicle is a challenge of dire importance. With pedestrians sharing roads with autonomous vehicles, there’s no room for error in object classification; any mistake is extremely dangerous and possibly fatal.

Until the detection and classification process of today’s sensing solutions improves, autonomous-vehicle software will continue to be confounded by the threat of false positives. We may never see the deployment of Level-5 autonomous vehicles if they can’t operate safely and reliably in cities amongst pedestrians.

Thermal Sensors Detect and (Correctly) Classify Objects at the Same Time

A new type of sensor using far-infrared (FIR) technology can provide not only complete detection of an AV’s surroundings, but also reliable classification of every object in them. It’s only with this accurate, simultaneous detection and classification that full autonomy can be realized.

Thermal sensors are able to deliver accurate detection and classification because they function differently from any other sensing solution. Unlike radar and LiDAR sensors that must transmit and receive signals, a FIR camera simply senses the heat radiating from objects, making it a “passive” technology. Because they sense thermal radiation in the far-infrared band, well beyond visible light, FIR cameras generate a new layer of information, detecting objects that may not otherwise be perceptible to a camera, radar, or LiDAR.

Besides an object’s temperature, FIR cameras also capture its emissivity—how effectively it emits heat. This allows a FIR camera to sense any object in its path, since every object has a different emissivity. Most importantly, it enables thermal sensors to immediately detect an object and classify whether it’s a human or an inanimate object.
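A first-order way to see why this thermal contrast exists is the Stefan-Boltzmann law, under which the power an object radiates scales with its emissivity and the fourth power of its absolute temperature. The sketch below uses rough textbook values for skin and asphalt; it isn’t AdaSky’s signal model, just an illustration of why a pedestrian stands out against a cool road at night (a real LWIR camera only integrates part of this radiation, in roughly the 8- to 14-µm band).

```python
# First-order illustration: radiated power per unit area, M = emissivity * sigma * T^4.
# Emissivity and temperature values are rough textbook figures, used only to show the
# contrast a passive FIR camera can exploit at night.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_emittance(emissivity: float, temperature_c: float) -> float:
    t_kelvin = temperature_c + 273.15
    return emissivity * SIGMA * t_kelvin ** 4

skin = radiant_emittance(emissivity=0.98, temperature_c=33.0)      # exposed human skin
asphalt = radiant_emittance(emissivity=0.95, temperature_c=10.0)   # road surface on a cool night
print(f"skin: {skin:.0f} W/m^2, asphalt: {asphalt:.0f} W/m^2, contrast: {skin - asphalt:.0f} W/m^2")
```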

With this information, a FIR camera can create a detailed picture of the roadway at both near and far range. Thermal FIR also detects lane markings and the positions and orientations of pedestrians (e.g., which direction they’re facing), and in most cases can determine if a pedestrian is stepping off the sidewalk and about to cross the road. The vehicle can then predict whether there’s a risk of hitting the pedestrian, helping to avoid the challenge of false positives. It also enables AVs to operate independently and safely in any kind of environment, whether urban or rural, day or night.
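A naive version of that crossing prediction can be written as simple kinematics: extrapolate the pedestrian’s tracked heading and walking speed a few seconds ahead and check whether the path enters the vehicle’s lane. The sketch below is hypothetical (the lane width, horizon, and coordinate conventions are all assumptions), not a production prediction model.

```python
# Naive kinematic sketch: flag a crossing risk if the pedestrian's extrapolated path
# enters the vehicle's lane within a short horizon. All constants are hypothetical.
import math

LANE_HALF_WIDTH_M = 1.8
HORIZON_S = 3.0

def crossing_risk(lateral_offset_m: float, heading_deg: float, speed_mps: float) -> bool:
    """lateral_offset_m: distance from lane center (positive = toward the sidewalk side).
    heading_deg: 0 = walking parallel to the road, 90 = walking straight toward the lane."""
    lateral_speed = speed_mps * math.sin(math.radians(heading_deg))
    predicted_offset = lateral_offset_m - lateral_speed * HORIZON_S
    return abs(predicted_offset) <= LANE_HALF_WIDTH_M

print(crossing_risk(lateral_offset_m=4.0, heading_deg=80.0, speed_mps=1.4))  # True: about to cross
print(crossing_risk(lateral_offset_m=4.0, heading_deg=5.0,  speed_mps=1.4))  # False: walking along the sidewalk
```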

FIR has been used for decades in defense, security, firefighting, and construction, making it a mature and proven technology. Currently, there are three leading FIR sensor companies: Autoliv, FLIR Systems, and AdaSky.

A side-by-side comparison of Viper versus an advanced full HD dashcam detecting pedestrians at night.

AdaSky is an Israeli startup that recently developed Viper, a high-resolution thermal camera that passively collects FIR signals, converts them into high-resolution VGA video, and applies deep-learning computer-vision algorithms to sense and analyze its surroundings (see figure). In fact, with Viper and its machine learning, the Uber vehicle from the March crash would still have had at least six seconds of detection. Instead of mistaking the victim for another vehicle or a bicycle, Viper would have instantly detected and classified her as a pedestrian, giving the vehicle nearly four seconds to brake.
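In outline, that kind of pipeline has three stages: raw thermal frames, a displayable video stream, and a learned detector. The sketch below is a hypothetical mock-up of those stages, with the bit depth, resolution, and detector all placeholders; it is not AdaSky’s actual implementation.

```python
# Hypothetical outline of the stages the article describes: raw FIR frames ->
# displayable 8-bit video -> learned detector. Everything here is a placeholder.
import numpy as np

def to_8bit_video_frame(raw_counts: np.ndarray) -> np.ndarray:
    """Stretch a raw thermal frame (e.g. 14-bit sensor counts) to an 8-bit VGA-style image."""
    lo, hi = np.percentile(raw_counts, [1, 99])        # robust contrast limits
    scaled = np.clip((raw_counts - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)

def detect_pedestrians(frame_8bit: np.ndarray) -> list:
    """Placeholder for a trained detector (a deep-learning model in a real system)."""
    return []  # would return bounding boxes and class labels

raw = np.random.randint(0, 2**14, size=(480, 640))    # stand-in for one 640x480 FIR frame
print(detect_pedestrians(to_8bit_video_frame(raw)))
```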

Despite this proven proficiency, many OEMs currently evaluating thermal sensors remain wary of the cost for mass-market use. Historically, the FIR sensing offered by legacy companies has been reserved exclusively for luxury brands. But newer sensor companies have developed technology that’s scalable for the mass market.

Only when autonomous vehicles can reliably detect and correctly classify every object in their surroundings will they be able to operate safely amongst pedestrians. And only thermal sensors can deliver this level of reliability.

Yakov Shaharabani is CEO and Board Member for AdaSky.
