> Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.

"Just like" is doing a lot of work there. I'm saying that "autopilot" has a meaning in the world of aircraft, and the FAA has guidance on how it's used. It still places all responsibility on the pilot. So in that sense, they are similar.

It's not that I think automakers shouldn't be liable for misleading marketing; it's that in this case I don't think the argument is strong.

> Thus, it's irrelevant that the small print

The driver has to agree that they understand how it works before using the feature. In the manual it's called out the same way my Subaru calls out warnings for EyeSight. In the 2019 Model S manual (the car in this accident), it's clearly labeled with multiple warnings and symbols. Calling it small print is disingenuous. Half of that section of the manual is warnings about the limitations of the driver assistance tech.

The fact that it's in the instruction manual is irrelevant, because consumers are not required to read the instruction manual before using the car. And either way, the manual conflicts with the marketing of the car, and with the CEO's many statements saying that the car could drive itself.

A consumer is neither expected nor required to resolve a conflict that the company created for itself through its own choice of conflicting language. Tesla was able to get away with it longer than expected, but now the floodgates are open to reason again.

> consumers are not required to read the instruction manual before using the car.

Indeed, and perhaps this is part of the problem. A reasonable person would find that, through licensure, there is an expectation that you know how to operate, and take responsibility for operating, the death machine you step into.

If Tesla's marketing is so dangerous, why hasn't the FTC acted even once? The FTC has been pinged by NHTSA and Congress, and at least the first of those requests came before this accident. It took years for NHTSA to implement the nag. NHTSA could have recalled autopilot before this accident happened.

Tesla did not get away with anything. The agencies failed to address it under a self-certification model that they themselves set. It's their job to ensure drivers are safe. Meanwhile, Germany and the California DMV did do something. If Tesla is to blame, so are the FTC and NHTSA.