The article explains that the crash snapshot shows:

- hands off wheel
- autosteer had the steering wheel despite a geofence flag
- no take-over warnings, despite approaching a T intersection at speed

Letting people use autopilot in unsafe conditions is contributory negligence. Given their marketing, that's more than worth 33% of the fault.

That they hid this data tells me everything I need to know about their approach to safety. Though that's nothing really new, considering how publicly deceitful Musk is about his fancy cruise control.

When you put your foot on the accelerator while on autopilot, a warning pops up saying the car will not brake.

From the article, the data Tesla withheld from the court revealed:

* There was no record of a “Take Over Immediately” alert, despite the car approaching a T-intersection with a stationary vehicle in its path.

* Moore found logs showing Tesla's systems were capable of issuing such warnings but did not do so in this case.

Every time you put your foot on the accelerator while the system is engaged, a warning pops up saying the car will not brake.

It happens right away and has nothing to do with any other warnings. If you own a Tesla you have seen this warning over and over.


No aircraft manufacturer misleads pilots about its automation capabilities.* Pilots have to go through extensive training, including type certification on the plane they will fly. They are acutely aware of its limitations. When anything does go wrong, there is a thorough and public postmortem, with legally binding findings and mitigations.

For Tesla, anyone who can read the one-sentence description of enabling autopilot (double-tap the stalk) can use it in any condition, without any special training, and the company will stonewall any accident investigation. The entire term "full self driving" is just "puffery".

* Yes, MCAS, but this is not an autopilot/autodrive system, and Boeing is in trouble for this.

I mean, to get a driver's licence, you also have to go through extensive training and pass a test, including a test on the type of car you drive.

If you're involved in a crash, you're also held legally responsible.

The nagging when you're not paying attention during Autopilot or FSD has come a long way since 2019, too.

Why aren't cruise-control vendors hit with quarter-billion-dollar fines for everyone who mistook the marketing of the feature, e.g. its misleading name, to mean it actually controls the cruise?

Cruise control, that means I just press the button and relax, right? That's what the salesperson told me! Why did the car veer off the highway? Why are they calling it cruise control when it doesn't control the cruise? There should have been a different name for this super-misleading feature!

Hah, comparing the training of an airline pilot to that of a driver. Thanks for the laugh.

I'll give you one thing: it's a great example of the continuum fallacy. I'm definitely going to use it in the future to help explain the fallacy. Thanks a bunch.

Not even slightly comparable to the training pilots receive.

> I mean, to get a driver's licence, you also have to go through extensive training and pass a test, including a test on the type of car you drive.

Not necessarily. There are still people driving who essentially bought their driver's licenses without any formal training.

What do you think about the Wikipedia article titled "False equivalence" that describes the logical fallacy?

https://en.wikipedia.org/wiki/False_equivalence

How exactly does it apply to autopilot, though?

The false equivalence is more akin to the "auto" in "autopilot" implying that the driver doesn't have to supervise, yet that connotation is never presented as a fault of the original system for which the name was devised.