Tesla’s Autopilot Steers Company Into Legal, Financial Peril

I think Tesla is in trouble.

The company has permitted market confusion about what its Autopilot feature can and cannot do. Following the death of a Tesla owner whose car crashed while in Autopilot mode, that confusion gives lawyers the grist they need to take Tesla to the cleaners.

It’s not hard to construct a scenario in which Tesla could face a $100 million liability as a result. With only about $1.2 billion in cash as of the end of 2015 and an annual burn rate of $1 billion, the company is ill-equipped to absorb a hit of this magnitude.

The company’s risk comes in three flavors.

  • If the deceased’s family files a lawsuit, I’d expect the claim to be in the neighborhood of $100 million, based on other wrongful death claims targeting car companies. Settlements involving auto manufacturers are typically confidential, but let’s say the two sides agree on $10 million.

  • A class action lawsuit would require perhaps several hundred Tesla owners banding together to claim that the company duped them about the Autopilot feature (more on this below), but a settlement could potentially affect everyone who owns an Autopilot-enabled Tesla. Autopilot is an optional feature that costs between $3,000 and $4,250. Let’s say a settlement calls for Tesla to pay each owner $1,000. With 70,000 Autopilot-enabled cars, that amounts to $70 million.

  • If regulators determine that Tesla misled consumers, released a feature it knew to be flawed, or inadequately warned consumers about potential problems or risks, the company could face a fine. GM was fined $35 million for failing to recall vehicles affected by its ignition switch defect promptly enough. I’m using that example to estimate Tesla’s exposure because the ratio of GM drivers who suffered death or injury (as determined by Kenneth Feinberg, who administered GM’s settlement fund) to the number of vehicles affected is nearly identical to Tesla’s. Let’s say Tesla’s fine comes in at “just” $20 million.

There’s $100 million.
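For readers who want to check the math, here’s a minimal back-of-envelope sketch in Python. Every figure in it comes from the bullets above; all of them are my assumptions, not reported numbers:

    # Back-of-envelope tally of the three exposure estimates above.
    # All figures are this post's assumptions, not reported numbers.

    wrongful_death_settlement = 10_000_000  # assumed settlement, down from a ~$100M claim
    autopilot_cars = 70_000                 # Autopilot-enabled Teslas on the road
    per_owner_payout = 1_000                # assumed class action payout per owner
    regulatory_fine = 20_000_000            # assumed fine, scaled loosely from GM's $35M

    class_action_total = autopilot_cars * per_owner_payout  # 70,000 x $1,000 = $70,000,000
    total_exposure = wrongful_death_settlement + class_action_total + regulatory_fine

    print(f"Estimated total exposure: ${total_exposure:,}")  # $100,000,000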

Most of this downside will depend on the extent to which plaintiffs’ attorneys and regulators can prove that the company misled customers. Proving that would require access to company documents and emails, but it’s downright easy to show that the market is confused about what the feature can and cannot do.

The problem starts with the term itself. “Autopilot” conjures images of hands-free driving, of a machine operating on its own. In an interview with Bloomberg last year, Tesla CEO Elon Musk said that Autopilot does not mean autonomous. More importantly, he pledged that “We’re going to be quite clear with customers that the responsibility remains with the driver.”

Tesla’s actual communications on the topic haven’t lived up to that pledge.

Here’s what Tesla’s press kit on its latest software release says (emphasis is mine): “The latest software update, 7.0, allows Model S to use its unique combination of cameras, radar, ultrasonic sensors and data to automatically steer down the highway, change lanes, and adjust speed in response to traffic.”

Tesla’s consumer-facing website has this to say about the feature (again emphasis is mine): “Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control. Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, and prevents the car from wandering off the road.”

Only once on either of these sites does Tesla mention driver responsibility, but it’s not in the context of Autopilot, and it’s not on the consumer site. The press page includes this disclaimer: “Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.”

Not surprisingly, then, YouTube is filled with Tesla drivers recording themselves with their hands off the wheel. Some of them even acknowledge their understanding that Autopilot does not mean “self-driving,” but their behavior – going hands-free – suggests that they don’t see a distinction.

Even the press is confused.

  • This article in The Verge considers the distinction between self-driving capabilities and Autopilot, but the sub-headline refers to 100 million miles logged on Autopilot. It reads, “That’s a lot of self-driven miles.”

  • Engadget’s treatment of Tesla’s announcement includes this headline: “Tesla’s Model S is getting a self-driving ‘autopilot’ mode in three months.”

Neither story carries a correction or an official comment from Tesla.

If Tesla really wanted to be ‘quite clear’ with consumers, we’d see the company actively correcting erroneous coverage. We’d see Tesla imploring drivers to use Autopilot as prescribed. After all, the accident proves that driving a Tesla hands-free in Autopilot mode poses a potentially deadly threat to the driver, passengers, and every other vehicle in the vicinity.

Yet I cannot find a single instance in which Tesla has commented on any of these videos. The company’s own YouTube channel includes a video that mentions the need for drivers to keep their hands on the wheel, even during Autopilot operation, then undermines that message with another video showing a completely driverless car exiting and re-entering a garage.

Tesla’s Facebook and Twitter pages similarly do not include any videos that emphasize the need to keep both hands on the wheel.

In other words, Tesla does not appear to have been clear with its customers, the press, or the rest of the driving public, which shares the roads with at least some owners who think it’s okay to take their hands completely off the wheel and let their Tesla “autopilot” itself. (Search for images of “Tesla autopilot” and you’ll see just how common this is.)

This is why I believe the company is in significant peril.


2 comments

  1. No way in hell would I buy, drive or be a passenger in a car like this. There are just too many ways things can go wrong. Or perhaps I saw too many Sci Fi movies when I was a kid about robots going crazy. I just don’t see how I could trust a machine + computer to do my job for me in this case. Why do we want “self-driving” cars in the first place? Also, Matt, in your first bulleted paragraph, was it an individual or a family that was killed in the accident?


  2. Thanks, Will. Tesla vehicles inspire passions on both ends of the spectrum. I’m with you, and not just because Tesla’s cars aren’t ready and aren’t being marketed accurately. As a recent article in the WSJ points out (http://www.wsj.com/articles/roads-that-work-for-self-driving-cars-1467991339), our roads aren’t ready for anything approximating “self-driving.” To answer your question, the accident in question killed the Tesla’s lone occupant (NYT story here: http://www.nytimes.com/2016/07/11/business/fatal-tesla-crash-draws-in-transportation-safety-board.html?_r=0). It also involved a truck, and I can only imagine how the driver of that vehicle is coping. That’s why Tesla’s system is a hazard to more than just those who own the car.

