The Fuse

Tesla Accident Shows Leadership Needed in Deployment of Autonomous Vehicle Technology

by Matt Piotrowski | July 08, 2016

The Tesla accident that resulted in the tragic death of the driver has raised a number of questions about the safety and future of autonomous vehicles.

Given the enormous anticipated benefits of autonomous vehicles, the crash, which occurred in May in Florida, underscores the need to redouble efforts on safety so the incident does not derail their deployment. Just as important, automakers, tech companies, and regulators must educate the public about new technology as it emerges and as the economy transitions to adopting autonomous vehicles on a large scale.
The Model S accident happened while the car was on Autopilot, a driver-assistance system that can steer and control the vehicle's speed without input from a human driver. A vehicle with semi-autonomous features such as Autopilot is not fully self-driving: the driver still bears responsibility and must be ready to take over at any time.

“This incident can either put a pause on the rollout of autonomy, or it will push regulators to act faster so these types of accidents aren’t commonplace,” Michelle Krebs of Autotrader told The Fuse.

“This accident is very tragic, but we don’t believe it will set back technology in a precipitous manner,” Scott Corwin, a managing director with Deloitte Consulting, told The Fuse. “The relatively limited number of accidents to date suggests that the technology for self-driving cars does work—and it will continue to be improved upon. But no one claimed that it will ever be infallible.”

The National Highway Traffic Safety Administration (NHTSA) is now investigating the accident. The results will be scrutinized closely to determine whether the crash was the direct result of a failure of the Autopilot system, of driver error, or of a combination of the two.

“The hope is that this tragic incident just increases the determination by all those involved to keep trying to improve the technology,” said Corwin.

News reports have also surfaced that a Tesla Model X was on Autopilot when it crashed on the Pennsylvania Turnpike last week, although there were no fatalities. Tesla has said no data has yet confirmed that Autopilot was on at the time of the accident. Nonetheless, the crash will only further fuel skepticism of semi-autonomous features. NHTSA has also launched an investigation into this incident.

The safety agency is also expected to issue guidance soon on regulating autonomous vehicles, and that guidance will be examined closely.

During the rollout of any new technology, there will always be setbacks, some of which may be fatal. Going forward, it's crucial that the right lessons be drawn from the Tesla accident: above all, that deployment proceed with caution and be accompanied by driver education.

“Clearly, the public needs to understand the different levels of autonomy,” said Krebs.

Deloitte’s Corwin said: “There needs to be more on-road testing of the technology. If this leads to increased regulatory restrictions to on-road testing, it could delay advancements in technology and safety.”

There are four levels of autonomy, with level 4 being fully self-driving. Level 4 vehicles are still in the testing phase and won't be available on a mass scale for some time. During the transition, semi-autonomous features such as Autopilot will become more commonplace.
Many will place the blame for the incident on Tesla, saying the company did not do enough to test its semi-autonomous features and train drivers before putting them on the market. Even before this accident occurred, the company had many critics. “Tesla is being reckless,” said Don Norman, the director of the Design Lab at University of California, San Diego. “From what I can tell, Tesla has no understanding of how real drivers operate and they do not understand the need for careful testing. So they release, and then they have to pull back.”

Meanwhile, critics of autonomy will likely pounce on this incident to argue that autonomous technology overall is overhyped and is too dangerous to deploy.

It’s too soon to know what the public’s perception of self-driving cars will be in the aftermath of the Tesla accident, but more and more studies show the overall benefits of the new technology. Vehicle innovation promises to enable a shared economy and to bring mobility to many who don’t have access to transportation. A recent study, conducted in Ann Arbor, Michigan, found that 200,000 personal vehicles could be replaced by just 18,000 shared, connected, autonomous vehicles. Such a shift would naturally reduce congestion and fuel consumption.
The biggest benefit, however, will come in safety. News of the Tesla accident arrived just as NHTSA reported that human error is to blame for some 94 percent of the roughly 6 million crashes that occur in the U.S. every year. According to preliminary data, roughly 35,200 people died in traffic accidents last year, up almost 8 percent from the previous year. The total annual social and economic costs of road accidents exceed $836 billion.
Tesla’s early-adopter customers have been very enthusiastic about taking risks with Autopilot. But the question has always been whether untrained, non-professional drivers were ready to take control when necessary. Chunka Mui, writing in Forbes in April, asked an important question about how drivers use the new feature, highlighting the dangers of the new technology: “Will they follow safety guidelines and use the Autopilot only under recommended conditions, or will they push the limits as their confidence grows?”

The latest incident, though, goes beyond Tesla. How will regulators, carmakers, and consumers deal with inevitable setbacks until we reach a point where vehicle accidents are increasingly rare? It’s critical that negative occurrences with autonomous vehicles, which are inevitable, serve as learning experiences until the full scope of benefits is realized.