
Tesla Driver Involved in Fatal Crash with Autopilot Engaged

July 2, 2016
A Tesla driver was involved in a fatal crash after the electric vehicle's automated driving software failed to react to another vehicle on the road. Federal highway-safety regulators have opened an investigation into the accident.

Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times.” While driving with Autopilot engaged, “you need to maintain control and responsibility for your vehicle.” “Always keep your hands on the wheel. Be prepared to take over at any time.”

These are the warnings that appear on the dashboard of the Tesla Model S before drivers can engage its Autopilot mode, and while drivers are using it on the highway. They are also the warnings that the company said were meant to protect the Tesla driver involved in a fatal crash in May, when the electric vehicle’s self-driving software failed to react to another vehicle on the road.

The accident occurred on May 7 in Florida, according to federal regulators, who have opened a formal investigation into the crash. The National Highway Traffic Safety Administration, which Tesla notified about the accident, is looking at whether the electric vehicle’s Autopilot mode was at fault.

According to Tesla, the accident was the result of “extremely rare circumstances”: the car was driving in Autopilot mode when the driver failed to notice a tractor-trailer ahead of his vehicle. The collision occurred as the tractor-trailer made a left turn in front of the Model S, and the car failed to apply the brakes.

Without naming the victim, Tesla called the accident “a tragic loss” in a statement. The automaker described him as “a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.”

The driver was later identified by the Florida Highway Patrol as Joshua Brown, the owner of a technology consulting firm in Ohio. He appeared to be an enthusiastic supporter of Tesla and semiautonomous driving, recording videos of himself using the Autopilot feature.

The Silicon Valley automaker said that it had informed regulators about the accident “immediately” after it occurred, though it only acknowledged the incident once the investigation became public on Thursday.

Many companies, including Google and Tesla, have been aggressive in testing cars that combine computers, sensors, and radar to drive themselves on highways. Ford, Volvo, and other traditional automakers are pouring large sums into autonomous driving, and sometimes – in the case of General Motors buying Cruise Automation – paying billions for software engineering.

Most of these companies are testing fleets of semiautonomous vehicles on closed circuits or actual roads, adding millions of miles to their odometers, and using the distance traveled as evidence of their success. Tesla, for example, said in a news release that the crash “is the first known fatality in just over 130 million miles where Autopilot was activated.”

But the latest accident has raised questions about whether the industry’s eagerness has outpaced the technology, and whether driving long distances is enough to ensure that autonomous cars are ready for the real world. Most companies have taken the stance that, by taking driving out of human hands, autonomous cars will make driving safer and reduce accidents.

Tesla’s Autopilot software, which was released last year as a software update in certain vehicles, is meant only for highway driving. On highways, there are fewer distractions for the sensors and cameras giving the vehicle a detailed view of its surroundings. The technology enables cars to make automatic lane changes, detect other vehicles, and apply the brakes before a collision occurs.
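As a rough illustration of the kind of decision logic involved, the sketch below estimates a time-to-collision from a tracked obstacle's range and closing speed and triggers braking when it falls below a threshold. It is a hypothetical simplification, not Tesla's implementation; the threshold value and function names are assumptions made for this example.

```python
# Hypothetical sketch of an automatic-emergency-braking check, NOT
# Tesla's implementation: brake when the time-to-collision (range
# divided by closing speed) drops below an assumed threshold.

TTC_BRAKE_THRESHOLD_S = 1.5  # assumed threshold, in seconds

def should_brake(range_m: float, closing_speed_mps: float) -> bool:
    """Return True if a tracked obstacle is close enough to warrant braking."""
    if closing_speed_mps <= 0:  # obstacle is not getting closer
        return False
    time_to_collision_s = range_m / closing_speed_mps
    return time_to_collision_s < TTC_BRAKE_THRESHOLD_S

# Example: an obstacle 30 m ahead, closing at highway speed (27 m/s)
print(should_brake(30.0, 27.0))  # True: TTC is about 1.1 s
```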

In the latest accident, the brakes were never applied. The Model S was driving with Autopilot on a divided highway when a tractor-trailer crossed perpendicular to the vehicle’s path. “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” Tesla explained in a statement.

Adding to Autopilot’s confusion was the high ride height of the trailer, which led the sensors to dismiss the trailer body as an overhead object rather than an obstacle in the road, and the Model S drove under the trailer. Had the vehicle been heading for the front or back wheels, the system would have engaged the brakes, the company said.
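To see how a high-riding obstacle could slip through, consider a hypothetical perception filter that discards detections whose lowest point sits above the car's clearance height, treating them as overhead structures such as signs or bridges. The sketch below is a simplified illustration of the failure mode described above, not Tesla's actual code; the threshold and field names are invented for this example.

```python
# Hypothetical illustration of how a height filter can hide an obstacle,
# NOT Tesla's implementation: detections whose lowest point sits above an
# assumed clearance ceiling are treated as overhead structures and dropped.

CLEARANCE_CEILING_M = 1.4  # assumed: anything starting above this is "overhead"

def relevant_obstacles(detections: list[dict]) -> list[dict]:
    """Keep only detections that intrude into the car's path."""
    return [d for d in detections if d["bottom_height_m"] < CLEARANCE_CEILING_M]

# A trailer body riding high above the road vs. an ordinary passenger car
trailer_side = {"label": "trailer", "bottom_height_m": 1.5}
passenger_car = {"label": "car", "bottom_height_m": 0.2}

print(relevant_obstacles([trailer_side, passenger_car]))
# Only the passenger car survives; the high-riding trailer is filtered out.
```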

The accident might be an occasion for Tesla to reflect on its automated driving technology, which it plans to update gradually in its vehicles. According to traffic regulators, the Autopilot feature is classified as Level 2 autonomous driving, which can handle highway driving but requires humans to take over for more complex tasks, such as stopping for pedestrians. Other companies are working on Level 3 and Level 4 vehicles that take more control out of the driver’s hands and are intended to be released as finished products.
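For orientation, the automation levels mentioned above can be summarized as a simple enumeration; the comments paraphrase the commonly cited NHTSA/SAE scheme and are a rough summary, not official definitions.

```python
# Rough paraphrase of the driving-automation levels discussed above;
# for orientation only, not an official NHTSA/SAE definition.
from enum import IntEnum

class AutomationLevel(IntEnum):
    PARTIAL = 2      # e.g. Autopilot: steers and brakes, human must supervise
    CONDITIONAL = 3  # drives itself in limited conditions; human is the fallback
    HIGH = 4         # handles all driving within its design domain
```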

In spite of the limited situations in which it can be used, and the driver’s responsibility to stay alert, Tesla has talked up Autopilot as extremely advanced. Last month, its chief executive, Elon Musk, went so far as to say that Autopilot was probably better than humans at driving.

In the news release, Tesla backpedaled slightly from its praise of the Autopilot system, saying that the technology was still a test feature in the beta phase. The company said that the dashboard display reminds drivers to keep their hands on the wheel before Autopilot starts, and that the car then checks periodically to ensure that drivers’ hands remain on the wheel.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the company said.

The accident might further dampen enthusiasm for autonomous cars among average drivers, who are at best ambivalent about automated driving and at worst fiercely opposed to it. In a recent survey from the University of Michigan, around 46% of drivers said that they would not want any autonomous driving mode in their vehicle. About 85% of respondents said they would prefer not to ride in fully autonomous cars that have control in all situations.

The results of the investigation might also affect the National Highway Traffic Safety Administration’s plans to release new guidelines for operating self-driving cars. The guidelines are scheduled to be released this month.


This article was updated on July 6, 2016.
