
Lawsuit Over Tesla Autopilot Fatality Unlikely To Win But It Uncovers Real Issues


A lawsuit against Tesla over the 2018 Model X Autopilot fatality was filed May 1 by the family of Walter Huang, the Tesla owner who died in the crash. In this crash, the Model X, driving on Autopilot, veered left out of its lane, sped up and drove straight into the crash attenuator barrier at the left-side carpool off-ramp from US 101 to Highway 85 in Mountain View. The crash attenuator (designed to crumple and absorb accident energy) had already been crushed in another accident 11 days before, making things far worse.

Huang had complained that Autopilot did not work well at this off-ramp. There, the road lines form a triangle called the "gore": the divider between the leftmost lane and the adjacent lane forks into two solid lines. As the two lines spread apart, they soon become as far apart as a car lane. A bit further on, they end at the crash attenuator, where the physical barrier between the two roads begins.

Back then, the lines of the gore were poorly painted, especially on the right (Highway 101) side. A likely explanation presents itself: the lane-line finding system in the Tesla decided the two diverging lines were actually a lane, and, because the right-hand line was worn away, that the left-hand line was the boundary of the lane it should be driving in. Tesla has not confirmed or refuted this hypothesis. The car then attempted to follow this "lane" -- right into the crash attenuator.
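
To make this hypothesis concrete, here is a toy sketch in Python -- purely my illustration, in no way Tesla's actual code -- of how a lane follower that trusts only clearly visible paint could pick the wrong boundaries at a worn gore:

```python
# Toy illustration only. A lane follower that keeps the two most
# confident line detections as its lane boundaries can be fooled
# when one real boundary is worn away.

def lane_center(candidates, min_confidence=0.5):
    """Pick lane boundaries from detected lines and return the center.

    Each candidate is (lateral_offset_m, confidence); offsets are
    relative to the car, negative = left. Worn paint returns low
    confidence and gets discarded.
    """
    visible = sorted(c for c in candidates if c[1] >= min_confidence)
    left, right = visible[0], visible[-1]
    return (left[0] + right[0]) / 2  # steer toward this point

# Normal lane: crisp lines 1.8 m to either side -> center at 0.0.
print(lane_center([(-1.8, 0.9), (1.8, 0.9)]))

# At the gore: the diverging left gore line (-3.5 m) is crisp, the
# worn right gore line (0.2 m) falls below threshold, so the
# perceived "lane" center shifts left, toward the barrier.
print(lane_center([(-3.5, 0.9), (0.2, 0.3), (1.8, 0.9)]))  # -0.85
```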

Here is the NTSB preliminary report on the incident. The full report is not yet out. Tesla was participating in the investigation, but the NTSB took the unusual step of removing it as a party when Tesla started making public statements about the incident, a no-no under NTSB rules.

The plaintiffs, Huang's surviving family, assert that the Tesla and its Autopilot system are "defective" because the car veered out of its lane, accelerated and drove straight into the crash attenuator. While that's obviously something you would not want a car to do, all of this is consistent with the state of the art in ADAS systems today.

Fortunately for Tesla, in my opinion the lawsuit is not very well formed. To win, the plaintiffs need to show that the system is defective (does not perform as designed and warranted), or that Tesla failed to warn of such defects or of problems with the design. That the car will crash if not properly supervised is an explicit, well-warned part of the design. The case contains a number of unlikely claims.

It asserts that a non-defective emergency braking system "does not allow a crash to occur," presuming such systems will never hit anything, or will at least always try to avoid it. That is certainly not the case for any such system sold by any vendor, nor do vendors warrant that it is. It claims that a non-defective lane-keeping system would always prevent the vehicle from driving outside its lane, which is also not at all the case with any such system on the market today. A ruling on those grounds could shut down all the world's collision avoidance systems, even though they are saving many lives.

The lawsuit even mentions that other cars have automatic emergency braking, which they do -- but all of those systems are imperfect and will not prevent all crashes. It claims that Teslas sold after the accident were better (that's true), but those systems were and are still imperfect, and are sold as imperfect.

They assert a special flaw in the uncommanded acceleration. The Tesla did indeed accelerate as it moved into the gore. This is what Tesla's cruise control does if it has been following a car going below its set speed, and things change so that it no longer sees a car in front of it. At that point, it accelerates up to its set speed. Drivers can find this disconcerting if they change lanes and suddenly find themselves going much faster after a long period of following slower cars. However, the car goes at the speed the driver set, though the driver may have set it some time ago, resulting in surprise.
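
As a sketch, the target-speed logic of this class of system looks something like the following. This is generic adaptive-cruise behavior as described above, not Tesla's actual code:

```python
# Generic adaptive cruise control target-speed logic (a sketch of
# common behavior, not Tesla's implementation).

def target_speed(set_speed, lead_car_speed):
    """Return the speed (mph) the car will try to hold."""
    if lead_car_speed is not None and lead_car_speed < set_speed:
        return lead_car_speed  # track the slower car ahead
    return set_speed           # no lead car: resume the set speed

print(target_speed(75, 55))    # 55 -- pinned behind slower traffic
print(target_speed(75, None))  # 75 -- lead car gone, accelerate
```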

The lawsuit claims that because Tesla released software updates after the accident with better emergency braking and lane following, as well as lane changing and exit taking, and because such features might have (they say would have) prevented the accident, Tesla is liable.

Tesla Autopilot is a driver-assist system known to make such mistakes. As such, all users are warned to be constantly vigilant and to take control if something like this happens. They are warned that the system is in beta test. They are warned to keep their eyes on the road and their hands on the wheel. If they don't keep applying modest force to the wheel to show that they are grasping it, a series of warnings appears: first a subtle visual cue, then a stronger one, and then an audible alarm. While the lawsuit may have some success accusing Tesla of overhyping its product, the reality is that when it gets down to actually buying and using it, Tesla goes to real effort to remind people that it is only driver assist.
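
The escalation works roughly like this sketch. The timing thresholds below are invented placeholders; Tesla's real values are unpublished and vary with speed and software version:

```python
# Sketch of the escalating hands-on-wheel warnings described above.
# Thresholds are invented placeholders, not Tesla's actual timings.

def warning_level(seconds_without_torque):
    """Escalate alerts the longer no steering torque is sensed."""
    if seconds_without_torque < 15:
        return None                        # no alert yet
    if seconds_without_torque < 25:
        return "subtle visual cue"
    if seconds_without_torque < 35:
        return "stronger visual warning"
    return "audible alarm"

for t in (10, 20, 30, 40):
    print(t, warning_level(t))
```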

Tesla will no doubt use those facts in its defense of this lawsuit. The driver was aware that this off-ramp was a problem, and Tesla has stated that, knowing that, he should never have turned Autopilot on for this stretch of road. The driver was apparently fairly aware of how Autopilot works and the risks involved.

Tesla maintains that Autopilot is indeed a beta-test product which is incomplete and requires supervision. Like regular cruise control, you must monitor it and correct any mistakes it makes. It just makes far fewer mistakes than most existing cruise control and lane-keeping products on the market, which Tesla says is a good thing, not a bad one. (It isn't really a beta-test product, of course. It's sold for money to almost all customers. Silicon Valley has adopted the habit of declaring all software products in perpetual beta test. Rather, Tesla's declaration of beta test should be read as a stronger declaration of "don't expect this to work all the time.")

As such, I suspect Tesla is likely to prevail in this lawsuit. Even if it is true that Huang, a software engineer and game developer who seemed reasonably familiar with Autopilot, believed it had a magic ability to never crash, I am not sure how the plaintiffs will be able to prove that.

The family is also suing the state of California because the crash attenuator was crumpled by an earlier accident and not fixed. Had it been restored, Huang might well have lived. It's an open question what duty of care the state has to repair these barriers quickly, but that's beyond the scope of this article.


On a probably unrelated note, Tesla released a software update May 2 which adds lane departure avoidance features to Tesla cars. They will now notice if you are drifting out of your lane without signaling, especially if you're not holding the wheel, and bring you back. It starts with a warning function that will beep at you and slow you down. If you actually look like you might hit another car or go off the road, it will actively steer the car back into the lane to save you. That would not have affected the accident in question, but it sounds like a feature that's good for everyone except those who try to change lanes by taking their hands off the wheel and letting the car drift over, all without using the turn signal. Other cars have this function, but I suspect Tesla's will be better than the rest.
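
Based on that description, the decision logic is presumably something like this sketch (the function and its inputs are my illustration, not Tesla's software):

```python
# Sketch of lane departure avoidance as described in the May 2
# update: warn on an unsignaled drift; actively steer back only if
# a collision or road departure looks likely. Names are illustrative.

def lane_departure_action(drifting, turn_signal_on, hazard_ahead):
    if not drifting or turn_signal_on:
        return "no action"                 # intentional or no drift
    if hazard_ahead:                       # another car, or road edge
        return "steer back into lane"
    return "beep and slow down"

print(lane_departure_action(True, False, False))  # beep and slow down
print(lane_departure_action(True, False, True))   # steer back into lane
print(lane_departure_action(True, True, False))   # no action
```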

What fault lies with Tesla?

There are things Tesla could have done better -- and indeed they have changed some of them since the accident. This is part of the normal process of product improvement in a situation like this. Some things they could still improve:

  1. In a situation where the lines get this confusing, the car should probably prefer conservative choices (tracking the line on the right) rather than changing into a new, ill-defined lane that suddenly appears.
  2. The way most adaptive cruise controls (including Tesla's) speed up suddenly when the lane in front of them clears should be much more gradual. Even though the driver set that higher speed, they may have set it some time ago. It's possible the accelerator pedal should need to be used to restore an old, higher speed.
  3. The road geometry should be stored in a map, so that the car knows that only the far-left line bends off and that the second-from-left lane (which the car was driving in) goes straight, both at this point and at all other points on roads where lanes diverge.
  4. It's a bit surprising to me that, since this location is just a few miles from Tesla HQ, they didn't spot the difficulty the car had here earlier, especially after Huang pointed it out to them.
  5. Tesla needs better monitoring of the driver (see below).

What's surprisingly not present directly in the lawsuit is the big issue that has been discussed in the industry -- whether having a very good driver assist system lulls even aware drivers into a false sense of security, and whether the sellers of such "too good" systems should face any responsibility for that, even if they take lots of reasonable steps to inform drivers of the realities of their system.

The plaintiffs touch briefly on this, referring to the bravado in Tesla's promotional materials. Tesla certainly does say that full self-driving functionality is just around the corner. This does make it easier for people to confuse the Tesla with a real self-driving car. Whether Huang did or not is another question. It is valid to assert that in spite of Tesla's warnings that the current system is not a self-driving one, people are treating it like one. And it's also valid to assert that Tesla knows that people are doing this. There are people who, in spite of the warnings, are not getting the message, or who are getting it and acting foolishly, treating Tesla Autopilot like a true robocar system.

This has indeed been a subject of public debate over what might be called the paradox of driver assist. You want a driver-assist system to be very good, but the better it gets, the easier it is to forget that it has flaws, and the easier it is to treat it like a full self-driving system. While you would never consider taking your attention from the road with a basic dumb cruise control, it's easier to consider doing so with adaptive cruise control, or with a collision avoidance system. The better the system, the greater the risk of bad supervision. At the same time, there is a strong instinct against saying that products get less safe the better you make them.

This was debated after Tesla's first Autopilot fatality, and NHTSA and NTSB released reports favorable to Tesla which go against the logic in this lawsuit. In addition, numerical analysis much touted by Tesla suggested that the "paradox of driver assist" described above was not happening -- that drivers using Autopilot had an overall better safety record than drivers who were not using it. Those numbers have recently been disputed, and that dispute has in turn been disputed by Tesla. My analysis of this is pending.

Tesla's theory is nonetheless certainly plausible, even if the numbers don't yet prove it. It can be the case that if, say, 90% of drivers use Autopilot correctly, supervising it well, and are 50% safer doing so, while 10% of drivers use it badly, becoming twice as dangerous, the overall result is still safer. The moral question becomes whether the danger from and for those 10% of drivers is so frightening that the additional safety should be forbidden. I would say no.
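
Working through those example numbers, with the risk of unassisted driving normalized to 1.0:

```python
# Worked version of the example above: overall relative crash risk,
# with unassisted driving normalized to 1.0.
careful_share, careful_risk = 0.90, 0.5    # 90% supervise well, 50% safer
careless_share, careless_risk = 0.10, 2.0  # 10% use it badly, twice the risk

overall = careful_share * careful_risk + careless_share * careless_risk
print(overall)  # 0.65 -- a 35% reduction in risk overall
```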

Driver monitoring and warning

The NTSB preliminary report, and statements from Tesla, make a lot of the fact that Huang had received two visual alerts and one auditory alert from Autopilot about holding the wheel in the 19 minutes prior to the crash, and that "for the last 6 seconds prior to the crash, the vehicle did not detect the driver's hands on the steering wheel."

For those who have not driven with Tesla Autopilot, that may sound bad. In fact, based on my Model 3 experience, this is entirely normal in the proper use of Autopilot. The only way Tesla detects hands on the wheel is by sensing torque on the wheel. It is not sufficient to be holding the wheel -- you must apply some noticeable steering force: enough for the system to notice, but not enough to make it disengage. Tesla drivers learn the art of holding the wheel just right, applying a modest pressure that gets noticed. Many drivers "phantom steer" in their heads, commanding their hands to turn the wheel as they would if they were driving, but reducing the force to a minimal amount. Even so, it is quite easy to trigger the visual and even the audible alerts if you grasp the wheel but fail to apply this pressure for too long. That Huang got 3 notifications in 19 minutes is not evidence that his hands were off the wheel or that he was using the system incorrectly. He applied torque to the wheel 3 times in his last minute, which also seems within a reasonable range on a straight stretch of highway.
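
That art amounts to keeping steering torque inside a band between two thresholds, as in this sketch. The actual threshold values are unpublished; the numbers here are invented:

```python
# Sketch of the two-threshold torque logic: enough torque to register
# as hands-on, but not enough to disengage Autopilot. The newton-metre
# values are invented, not Tesla's actual calibration.

DETECT_NM = 0.3     # below this, hands are not "detected"
DISENGAGE_NM = 2.5  # above this, the driver takes over

def wheel_state(torque_nm):
    if torque_nm < DETECT_NM:
        return "hands not detected"    # warnings will escalate
    if torque_nm > DISENGAGE_NM:
        return "autopilot disengages"  # driver has taken control
    return "hands detected"            # the band drivers aim for

for t in (0.1, 1.0, 3.0):
    print(t, wheel_state(t))
```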

The lack of torque in his final 6 seconds suggests a different story. If he had been properly watching the road, he should have seen his car veer to the left and turned the wheel hard enough to take control and return to his lane. He didn't. We may never know the reason for that. It's particularly odd since he had encountered this problem before at this off-ramp and was worried about it. It was a fatal error. People take their eyes off the road for short periods all the time, both safely and unsafely. They even take their hands off the wheel for short periods on straight stretches of road in cars that have no auto-steering function whatsoever. Each time people do this, they create a risk; most of the time they get away with it. Huang did not.

Tesla could use different methods to decide if a driver is paying attention to the road or has her hands on the wheel. Some automakers are using a camera that looks at the driver's eyes -- Tesla has a small interior camera but does not use it at this time. The wheel could be designed to register grasp in addition to torque, though it is valid to claim that torque indicates more engagement than mere grasp does.

Disclosure: The author bought a small number of Tesla shares on the dip around their April 2019 earnings report. He likes to buy on dips.

Leave a comment at this site.