I might push back on "autopilot is cruise control." To me, Tesla is marketing the feature as much more than that. Either way, looking up the definitions of each:

"Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."

"Cruise Control: an electronic device in a motor vehicle that can be switched on to maintain a selected constant speed without the use of the accelerator."

> "Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."

All an autopilot on an aircraft does is keep the plane flying in a straight line at a constant speed. It mostly doesn't do obstacle avoidance, or really anything else. Yes, you don't need the intervention of the pilot, because it turns out that going in a straight line in an airplane is pretty hard to screw up.

By that standard, at least, modern cruise controls are more capable than airplane autopilots. There is a widespread belief on HN, however, that people are generally very dumb and will mistake autopilot for something more like FSD.

    There is a widespread belief on HN, however, that people are generally very dumb and will mistake autopilot for something more like FSD.
I think the error here is that you're underestimating just how rare accidents are. Let's imagine there's some monstrously dangerous feature X that results in fatal collisions 10% of the time when misused. If we assume a person either uses it correctly or consistently misuses it, how many people (1 in N) need to be misusers to double annual US road fatalities?

You only need about 1 misuser in every 500-2,000 drivers, depending on how you do the numbers. Now obviously autopilot isn't as dangerous as our hypothetical feature X here, but do you think it's reasonable to argue that a small fraction of a percent of autopilot users might be misled about its capabilities by the name? I think that's a long way from saying "people are generally very dumb".
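For a rough sense of where that 1-in-500-to-2,000 range comes from, here's a back-of-envelope sketch (Python). The fleet size, annual fatality count, and the 10%-per-year misuse risk are assumed round numbers for illustration, not sourced figures:

    # Back-of-envelope: how rare can misusers of hypothetical feature X be
    # and still double annual US road fatalities?
    # All inputs are assumed round numbers, not sourced figures.
    licensed_drivers = 230_000_000   # very roughly, US licensed drivers
    annual_fatalities = 40_000       # very roughly, annual US road deaths
    misuse_fatality_rate = 0.10      # hypothetical: misuse proves fatal 10% of the time, per year

    # Doubling the toll means misusers must add about annual_fatalities extra deaths.
    misusers_needed = annual_fatalities / misuse_fatality_rate   # 400,000 people
    one_in_n = licensed_drivers / misusers_needed                # ~575

    print(f"Misusers needed: {misusers_needed:,.0f}")
    print(f"That is about 1 in {one_in_n:,.0f} drivers")

Vary the assumptions (fleet size, how often misuse actually proves fatal) and you land anywhere in that 1-in-500 to 1-in-2,000 band.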

I don't think they were misled by the name. I dislike Musk and Tesla, but the use of "autopilot" to describe their cruise control is one of those things where they are being fairly pedantic and, out of character for them, actually technically correct. Anyone who knows what an autopilot does won't be misled, and anyone whose only experience with the term is the 1980 movie Airplane! (Otto Pilot, anyone?) also isn't going to be misled. What we basically have left, then, are HN pedants who themselves personally got the term wrong and didn't do any due diligence.

My question is whether you believe the term is so wildly obvious that questioning whether a tiny fraction of a percent of drivers misunderstand it is completely unreasonable. It doesn't rely on Tesla having misused the term at all, and for the record, I don't think they have.

But I don't think it's unreasonable that some of the 5% of US adults who have never been on a plane might not understand what autopilot is in aviation. I don't think it's likely that the 8.4% of US adults who score below the lowest measurable level of PIAAC literacy have a good understanding of the warning messages shown when you enable Tesla's L2 features, or are digging through the owner's manual to understand them. It seems unlikely that the 3% of adults with IQs below 70 are reasoning out the limitations of the system from technical definitions. Hopefully the idea is obvious here: you only need one person out of thousands to make a massively dangerous system. I don't think it's an obviously ridiculous argument that one person out of thousands doesn't fully understand and consider the complicated limitations of such a system.

It IS fancy cruise control.

That is not how it’s marketed at all.

You are right, but unfortunately it is the least useful kind of right: technically right.

That is definitely what auto pilot means in the aeronautical and maritime sphere.

But a lot of the general public has a murky understanding of how an autopilot on a ship or a plane works. So many of them, probably the majority, will look at the meaning of those two words and conclude that autopilot means automatic pilot, which basically ends up being self-driving.

Sure, in a perfect world they would look up what the term means in a sphere they do not know and use it correctly, but that is not the world we live in. We do not get the general public we want; we have to live with the one we got.

You are also merely technically right. It would require an intentional suspension of one's theory of mind not to recognize the extent to which Tesla's own marketing of its products, and its determination to hide pertinent information in cases like this, is intended to perpetuate the popular misconception of its capabilities.

> ...we have to live with [the world] we got.

There was nothing inevitable in how we reached this situation, and no reason to let it continue.

In both cases, they are assistance systems. A pilot is responsible for, and must monitor, an autopilot system in a plane. We license drivers and pilots, and the responsibility is placed on them to understand the technology before using it and putting themselves and others at risk.

Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true. Is there any evidence of the former? Intuitively I would say it's unlikely we'd blame Boeing if a pilot were misled by marketing materials. Maybe that has happened, but I haven't found anything of that sort (please share if you're aware of any).

The difference is in the sheer amount of training pilots have to go through, and the regulations that they, and their employers, are required to follow. This is tremendously different from a car that throws up a couple of warnings that can be quickly and passively acknowledged prior to your using "autopilot".

You can't passively acknowledge it. It requires actively agreeing that you understand it prior to use. The dialog in one version even references the aviation definition of autopilot.

Maybe when this accident happened it was different, but as far as I know it's always been behind a confirmation dialog.

To operate a motor vehicle in the US, you must be licensed. That surely holds some weight here.

While technically an autopilot might sometimes be as simple as maintaining a heading, the practical consequences are quite different for a boat or an aircraft than for a car. There is simply not much to crash into when you're in the air or on open water. As a result, a much simpler mechanism is enough to give the pilot the same benefit.

When I worked on unmanned vehicles, you could have one operator control multiple speedboats because you typically had minutes to avoid collisions. Splitting attention would not be feasible with a car on cruise control, because you are never more than a few seconds away from crashing into something solid.
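To put rough numbers on that attention budget, here's a quick sketch; the speeds and distances are illustrative assumptions, not figures from that work:

    # Rough time-to-hazard comparison: open water vs. a highway.
    # All values are illustrative assumptions, not measurements.
    def seconds_to_hazard(distance_m, closing_speed_kmh):
        # Time until reaching a hazard at a constant closing speed.
        return distance_m / (closing_speed_kmh / 3.6)

    # A boat closing on an obstacle 2 km away at 40 km/h: minutes of margin.
    boat_s = seconds_to_hazard(2_000, 40)   # ~180 s, about 3 minutes

    # A car closing on stopped traffic 100 m ahead at 110 km/h: a few seconds.
    car_s = seconds_to_hazard(100, 110)     # ~3.3 s

    print(f"Boat: {boat_s / 60:.1f} minutes, car: {car_s:.1f} seconds")

That gap of roughly a factor of fifty in reaction time is why one operator could split attention across several boats, while a driver on cruise control cannot look away from the road for long.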

> Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true

Actually, the former is true. Courts and juries have repeatedly held that companies can be held responsible for marketing language. They are also responsible for the contents of their instruction manual. If there are inconsistencies with the marketing language it will be held against the company because users aren't expected to be able to reconcile the inconsistencies; that's the company's job. Thus, it's irrelevant that the small print in the instruction manual says something completely different from what all the marketing (and the CEO himself) says.

The "autopilot is limited" argument would have worked 20 years ago. It doesn't today. Modern autopilots are capable of maintaining speed, heading, takeoff, and landing so they're not just pilot assistance. They're literally fully capable of handling the flight from start to finish. Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.

> Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.

"Just like" holds a lot of weight. I'm saying autopilot has a meaning in the world of aircraft, and the FAA has some guidance on how it's used. They still place all responsibility on the pilot. So in that sense they are similar.

It's not that I think automakers shouldn't be liable for misleading marketing, it's that in this case I don't think the argument is strong.

> Thus, it's irrelevant that the small print

The driver has to agree that they understand how it works before using the feature. In the manual it's called out in the same way my Subaru calls out warnings for EyeSight. In the 2019 Model S manual, the car in this accident, it's clearly labeled with multiple warnings and symbols. Saying it's small print is disingenuous. Half of the manual in that section is warnings about the limitations of the driver assistance tech.

The fact that it's in the instruction manual is irrelevant because consumers are not required to read the instruction manual before using the car. And either way, the manual conflicts with the marketing of the car and with the CEO's many statements saying that the car could drive itself.

A consumer is not expected, nor required, to resolve this conflict that the company created for itself through its own choice of conflicting language. Tesla was able to get away with it longer than expected, but now the floodgates are open to reason again.

> consumers are not required to read the instruction manual before using the car.

Indeed, and perhaps this is part of the problem. A reasonable person would find that, through licensure, there is an expectation that you know how to operate, and take responsibility for the operation of, the death machine you step into.

If Tesla's marketing is so dangerous, why hasn't the FTC acted even once? The FTC has been pinged by NHTSA and Congress, at least the first time before this accident. It took years for NHTSA to implement the nag. NHTSA could have recalled Autopilot before this accident happened.

Tesla did not get away with anything. The agencies failed to address it under the self-certification model they set. It's their job to ensure drivers are safe. Meanwhile, Germany and the California DMV did do something. If Tesla is to blame, so are the FTC and NHTSA.

> They're literally fully capable of handling the flight from start to finish.

I find this to be a bit of a rosy take on things.

Autopilots don't take off (which is why Airbus' ATTOL project was a notable thing when an A350 took off "autonomously" [1]). They don't handle ATC (with the tenuously arguable exception of things like Garmin's Autoland), or handle TCAS on what I'd say is a majority of airliners.

Autopilot on planes is still quite "dumb".

1- https://www.airbus.com/en/newsroom/press-releases/2020-01-ai...
