Tesla’s ‘autopilot’ crash is further proof of technology flaws

This week’s question comes from Marianne K. in Pacific Heights, who asks:

Q: “I read your column last week about the first Uber robot-car pedestrian death in Arizona. Then, the next day, there was a fatal crash involving a Tesla on ‘autopilot.’ What is going on? Who is responsible when a car on autopilot crashes?”

A: This series of fatal tragedies is a huge wake-up call. The technology is not yet ready for mainstream introduction into our roads and highways. Driving is a complex, dynamic activity that utilizes all of our faculties. While carefully developed autonomous vehicles of the future may increase safety, current automated driving features may not be as safe as touted.

Sadly, this is not the first autopilot death. In May 2016, Joshua Brown, a former Navy SEAL, died near Williston, Fla., when his Tesla Model S, operating in “autopilot” mode, collided with a tractor-trailer. The crash happened when the truck made a left turn across Brown’s path while he was traveling 74 mph.

The Model S, which relied heavily on cameras in its operation, did not recognize the white trailer against a bright, overcast sky. A major National Transportation Safety Board investigation ensued, resulting in the June 2017 issuance of a 500-page report on the crash. The NTSB found no system failures and reported that during a 37-minute period of the trip when Brown was required to have his hands on the wheel, he apparently did so for just 25 seconds. While dispelling the urban myth that Brown was watching a Harry Potter video on the Model S control panel, the report found Brown was given a visual warning, “Hands Required Not Detected,” on seven separate occasions, each lasting one to three seconds.

In September 2016, before the NTSB report came out, Tesla, having reviewed the data from the Brown crash, announced “improvements” in its autopilot system, adding new “restrictions” on the hands-off driving features along with improvements in its use of radar that its chief executive officer said likely would have prevented the crash. The updated system was said to temporarily block drivers from using Autopilot if they did not respond to audible warnings to take back manual control. Now, it seems, the improvements were not enough to prevent another death.

Tesla has been criticized for branding its driver-assistance package as an “autopilot.” Both Consumer Reports and the German government asked Tesla to stop using the autopilot moniker, arguing that it conveys a false sense of security, leading to driver inattention and abdication of operational control. Indeed, German transport minister Alexander Dobrindt asked Tesla to drop the term “autopilot,” arguing it can lead consumers to think the car has greater abilities than it does.

Tesla refused, stating, “Just as in an airplane, when used properly, Autopilot reduces driver workload and provides an added layer of safety when compared to purely manual driving.” While Tesla’s response may be technically true, it dodges the reality: Drivers are treating the feature as if it were an autopilot that can assume control while the driver “multitasks.”

Tesla is already on the defense (employing a strong offense), releasing information from the data it records indicating that the driver, Wei Huang, should have had about five seconds and 500 feet of unobstructed view of the concrete barrier he struck before the crash. According to Tesla, Huang did not have his hands on the wheel for six seconds prior to the impact. Earlier in the drive, Tesla reports, Huang had been given multiple visual warnings and one audible warning to put his hands back on the wheel.

Tesla also released photos of a missing crash-absorbing barrier that could have attenuated the impact. It seems the barrier may have been removed following an earlier crash and had yet to be replaced. In short, Tesla is saying it was either Huang’s fault or Caltrans’ fault.

Legally, the analysis will examine the relative fault associated with the collision. What is Tesla’s responsibility for its autopilot failing to merge right or left at the “Y” in the highway and, instead, heading into a concrete wall faced with a reflective panel? Did the reflective material on the warning sign interfere with the radar-based sensing Tesla emphasized after the Brown fatality in Florida? Was there a problem with the mapping software, which may have been thrown off by the change in conditions created by the barrier’s removal?

Likewise, Huang’s conduct will be evaluated: Was he unreasonable in his reliance on a system touted as an autopilot? Finally, what is Caltrans’ responsibility, if any, if it failed to replace a crash-absorption system? Did its failure to act create a “dangerous condition of public property”?

As the technology is still in its infancy, it is unclear who will bear responsibility. One thing is clear, though: The courts will take a pivotal role in apportioning responsibility. Let’s just hope that not too many more people have to die before the courts or the Legislature work out the regulatory bugs.

Christopher B. Dolan is owner of the Dolan Law Firm. Email questions to help@dolanlawfirm.com.
