This seems pretty dumb of Tesla, as I find it rather moot to the conclusion of fault in the accident. The obstruction of justice is damning.
Autopilot is cruise control. Once you understand this, claiming that Tesla is partially at fault here does not match the expectations we place on other driver-assistance tech. Just because Tesla has the capability of disabling it doesn't mean they have to.
This all comes down to an interpretation of marketing speak. If you believe "autopilot" is misleading you'd agree with the jury here; if you don't, you wouldn't. I'm no lawyer, and don't know the full scope of requirements for autopilot-like features, but it seems that Tesla is subject to unfair treatment here given the amount of warnings you have to completely ignore and take no responsibility for. I've never seen such clear warnings on any other car with similar capabilities. I can't help but think there's maybe some politically driven bias here, and I say that as a liberal.
Happy to be convinced otherwise. I do drive a Tesla, so there's that.
Do you think Tesla spends more time and money on making their warnings convincing, or making their marketing convincing? If a person is hearing two conflicting messages from the same group of people, they'll have to pick one, and it shouldn't be surprising if they choose to believe the one that they heard first and that was designed by professionals to be persuasive.
In other words, if you bought the car because you kept hearing the company say "this thing drives itself", you're probably going to believe that over the same company putting a "keep your eyes on the road" popup on the screen.
Of course other companies have warnings that people ignore, but they don't have extremely successful marketing campaigns that encourage people to ignore those warnings. That's the difference here.
Tesla infamously doesn't have a marketing team, so that one should be easy to answer unless they rehired for that. Not sure on the latest there.
When you get your Tesla and attempt to turn on the features described, it presents a dialog with a warning, and you have to agree that you understand it before proceeding. If you choose to ignore that, and all the other clearly marked and accessible warnings in the manual, and not learn how to operate it, is that not on you, the licensed driver? Isn't that what regulations and licensure are for?
I'm very much in favor of consumer protections around marketing, but in this case there were none that were clearly defined to my knowledge.
Yes of course it is on you, that's why the driver in this case was found mostly responsible. The question is whether Tesla bears any additional responsibility, not whether they are solely at fault, which they obviously aren't.
> Tesla infamously doesn't have a marketing team
Come on, obviously they do marketing. "We don't have a marketing team" is itself marketing. Here's their most popular YouTube ad, for example: https://www.youtube.com/watch?v=tlThdr3O5Qo
That video, which is called "Full Self-Driving" even though it came out when Autopilot was the only capability the cars had, was coincidentally released 3 days before the crash we're discussing. Do you think an ad like that, which simply shows the car driving itself and links to a page about Autopilot in the description, might lead someone to believe that Autopilot will do what the video shows? Again, remember that there was no separate FSD product at the time.
> Come on, obviously they do marketing.
Of course they do. You asked about time and money spent on marketing compared to warning clarity and the manual. Having no team for marketing implies little time and money spent. I would say they've spent far more money trying to make it safe and clear in the user manual than trying to mislead customers in a few tweets and a video.
> might lead someone to believe that Autopilot will do what the video shows?
It does do what it shows? The driver is attentive, not placing their foot on the pedal and not taking their eyes off the road. I believe that is exactly how FSD operated until they were forced to implement the nag 4(?) years after this video was made.
Afaik this is the video congress sent to the FTC, under the awesome Lina Khan, and they chose to do nothing about it. I agree Tesla should do better, but that's different than finding them liable for wrongful death in this specific case.
Germany did something about it. California DMV did something about it. If anything it seems the NHTSA and FTC dropped the ball. That seems to be in Tesla's favor legally.
I'm reminded of Vitamin Water...
> the Center for Science in the Public Interest filed a class-action lawsuit
> The suit alleges that the marketing of the drink as a "healthful alternative" to soda is deceptive and in violation of Food and Drug Administration guidelines.
> Coca-Cola dismissed the allegations as "ridiculous," on the grounds that "no consumer could reasonably be misled into thinking Vitaminwater was a healthy beverage"
Interesting case but I'm not sure it's apples to apples.
One, you don't need a license to buy a non-alcoholic beverage. Two, while the FDA has clear guidelines around marketing and labeling, I'm not aware of any regulatory body having clear guidelines around driver assistance marketing. If they did, it wouldn't be controversial.
The analogy is that marketing knows their claims are BS, but the people/courts may disagree
I might challenge the claim that "autopilot is cruise control." To me, Tesla is marketing the feature much differently. Either way, looking up the definitions of each:
"Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."
"Cruise Control: an electronic device in a motor vehicle that can be switched on to maintain a selected constant speed without the use of the accelerator."
> "Auto Pilot: a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot."
All an auto pilot on an aircraft does is keep the plane flying in a straight line at a constant speed. It mostly doesn't do obstacle avoidance, or really anything else. Yes, you don't need intervention of the pilot, because it turns out going in a straight line in an airplane is pretty hard to screw up.
From that standard at least, modern cruise controls are more capable than airplane auto pilots. There is a widespread belief on HN, however, that people are generally very dumb and will mistake autopilot for something more like FSD.
You only need about 1 misuser in every 500-2,000 drivers, depending on how you do the numbers. Now obviously autopilot isn't as dangerous as our hypothetical feature X here, but do you think it's reasonable to argue that a small fraction of a percent of autopilot users might be misled about its capabilities by the name? I think that's a long way from saying "people are generally very dumb".
I don’t think they were misled by the name. I dislike Musk and Tesla, but the use of autopilot to describe their cruise control is one of those things where they are being fairly pedantic, and, out of character for them, actually technically correct. Anyone who knows what an autopilot does won’t be misled; anyone whose only experience with the term is from the 1980 movie Airplane! (Otto Pilot, anyone?) also isn’t going to be misled. Then what we basically have left are HN pedants who themselves personally got the term wrong and didn’t do any due diligence.
My question is whether you believe the term is so wildly obvious that questioning whether a tiny fraction of a percent of drivers misunderstand it is completely unreasonable. It doesn't rely on Tesla having misused the term at all, and for the record, I don't think they have.
But, I don't think it's unreasonable that some of the 5% of US adults who have never been on a plane might not understand what autopilot is in aviation. I don't think it's likely that the 8.4% of US adults who score below the lowest measurable level of PIAAC literacy have a good understanding of the warning messages when you enable Tesla's L2 features, or are digging through the owner's manual to understand them. It seems unlikely that the 3% of adults with <70% IQs are reasoning out the limitations of the system from technical definitions. Hopefully the idea is obvious here. You only need one person out of thousands to make a massively dangerous system. I don't think it's an obviously ridiculous argument that one person out of thousands doesn't fully understand and consider the complicated limitations of such a system.
It IS fancy cruise control.
That is not how it’s marketed at all.
You are right, but unfortunately yours is the least useful kind of right, which is technically right.
That is definitely what auto pilot means in the aeronautical and maritime sphere.
But a lot of the general public has a murky understanding of how an autopilot on a ship or a plane works. So a lot of them, probably the majority, will look at the meaning of those two words and land on "autopilot means automatic pilot," which basically ends up being self-driving.
Sure, in a perfect world they would look up what the term means in the sphere they do not know and use it correctly, but that is not the world we live in. We do not get the general public we want; we have to live with the one we got.
You are also merely technically right. It would require an intentional suspension of one's theory of mind to not recognize the extent to how Tesla's own marketing of its products, and its determination to hide pertinent information in cases like this, is intended to perpetuate the popular misconception of its capabilities.
> ...we have to live with [the world] we got.
There was nothing inevitable in how we reached this situation, and no reason to let it continue.
In both cases, they are driver assistance. A pilot is responsible and must monitor an autopilot system in a plane. We license drivers and pilots and the responsibility is placed on them to understand the technology before using it and putting themselves and others at risk.
Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true. Is there any evidence of the former? Intuitively I would say it's unlikely we'd blame Boeing if a pilot was misled by marketing materials. Maybe that has happened, but I haven't found anything of that sort (please share if aware).
The difference is in the sheer amount of training pilots have to go through, and the regulations that they, and their employers, are required to follow. This is tremendously different from a car that throws up a couple of warnings that can be quickly and passively acknowledged prior to your using "autopilot".
You can't passively acknowledge it. It requires actively agreeing that you understand prior to use. The dialog in one version makes reference to autopilot in the pilot definition.
Maybe when this accident happened it was different, but as far as I know it's always been behind a confirmation dialog.
To operate a motor vehicle in the US, you must be licensed. That surely holds some weight here.
While technically an autopilot might sometimes be as simple as maintaining a heading, the actual practical consequence is quite different for a boat or an aircraft than for a car. There is simply not much to crash into when you're in the air or open water. The result is that a much simpler mechanism is required to achieve the same result for the pilot.
When I worked on unmanned vehicles, you could have one operator control multiple speedboats because you typically had minutes to avoid collisions. Splitting attention would not be feasible with a car on cruise control, because you are never more than a few seconds away from crashing into something solid.
> Would Boeing or John Deere be responsible for marketing language, or just the instruction manual? We know the latter is true
Actually, the former is true. Courts and juries have repeatedly held that companies can be held responsible for marketing language. They are also responsible for the contents of their instruction manual. If there are inconsistencies with the marketing language it will be held against the company because users aren't expected to be able to reconcile the inconsistencies; that's the company's job. Thus, it's irrelevant that the small print in the instruction manual says something completely different from what all the marketing (and the CEO himself) says.
The "autopilot is limited" argument would have worked 20 years ago. It doesn't today. Modern autopilots are capable of maintaining speed, heading, takeoff, and landing so they're not just pilot assistance. They're literally fully capable of handling the flight from start to finish. Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.
> Thus, the constant refrain that "autopilot in cars is just like autopilot in planes" actually supports the case against Tesla.
Just like, holds a lot of weight. I'm saying autopilot has a meaning in the world of aircraft and the FAA has some guidance on how it's used. They still place all responsibility on the pilot. So in that sense they are similar.
It's not that I think automakers shouldn't be liable for misleading marketing, it's that in this case I don't think the argument is strong.
> Thus, it's irrelevant that the small print
The driver has to agree that they understand how it works before using the feature. In the manual it's called out in the same way my Subaru calls out warnings for EyeSight. In the 2019 Model S manual, the car in this accident, it's clearly labeled with multiple warnings and symbols. Saying it's small print is disingenuous. Half of the manual in that section is warnings about the limitations of the driver assistance tech.
The fact that it's in the instruction manual is irrelevant because consumers are not required to read the instruction manual before using the car. And either way, the manual conflicts with the marketing of the car, and with the CEO's many statements saying that the car could drive itself.
A consumer is not expected, nor required, to resolve this conflict that the company created for itself through its own choice of conflicting language. Tesla was able to get away with it longer than expected, but now the floodgates are open to reason again.
> consumers are not required to read the instruction manual before using the car.
Indeed, and perhaps this is part of the problem. A reasonable person would find that through licensure, there is an expectation that you know how to operate and take responsibility for the operation of the death machine you step into.
If Tesla's marketing is so dangerous why hasn't the FTC acted even once? FTC has been pinged by the NHTSA and Congress. At least the first time was before this accident. It took years for NHTSA to implement the nag. NHTSA could have recalled autopilot before this accident happened.
Tesla did not get away with anything. The agencies failed to address it in a self certification model set by them. It's their job to ensure drivers are safe. Meanwhile Germany and the California DMV did do something. If Tesla is to blame, so are the FTC and NHTSA.
> They're literally fully capable of handling the flight from start to finish.
I find this to be a bit of a rosy take on things.
Autopilots don't take off (which is why Airbus' ATTOL project was a notable thing when an A350 took off "autonomously" [1]). They don't handle ATC (with the tenuously arguable exception of things like Garmin's Autoland), or handle TCAS on what I'd say is a majority of airliners.
Autopilot on planes is still quite "dumb".
1- https://www.airbus.com/en/newsroom/press-releases/2020-01-ai...
> given the amount of warnings you have to completely ignore and take no responsibility for.
The article says no warnings were issued before the crash.
So which warning did the driver miss?
The one you accept when you first turn it on. And the numerous ones you ignored/neglected to read when using features without understanding them.
This is the responsibility of a licensed driver. I don't know how a Mercedes works, but if I crash one because I misused a feature clearly outlined in their user manual, Mercedes is not at fault for my negligence.
If the feature you misused wasn't a part of your driver's ed class/driver's license test, and was dangerous enough to cause a crash if used improperly, perhaps Mercedes is at fault (to whatever degree) because they didn't do enough to ensure that drivers knew how to use it. Yes, technically, the driver may be at fault because, well... they're the driver, but this isn't something that is "either/or" - both can be at fault.
Drivers need to be paying attention, but is it not possible that Tesla could also do more to make things clear?
The FAA sets guidelines for autopilot; should the NHTSA not do so for ADAS and enforce that on automakers to keep everyone safe? What that looks like, I'm not so sure.
I do think that needs to happen. That's pretty central to my point. Holding Tesla liable when the regulators haven't done so seems like an unfair judgment, even if Tesla and so many other automakers can and should do better. But liability should rest on something clearer than a disagreement over the word autopilot in marketing materials.
> but it seems that Tesla is subject to unfair treatment here given the amount of warnings you have to completely ignore and take no responsibility for.
Lol is this for real? No amount of warnings can waive away their gross negligence. Also, the warnings are clearly completely meaningless because they result in nothing changing if they are ignored.
> Autopilot is cruise control
You're pointing to "warnings" while simultaneously saying this? Seems a bit lacking in self awareness to think that a warning should carry the day, but calling cruise control "autopilot" is somehow irrelevant?
> I can't help but think there's maybe some politically driven bias here
Look only to yourself, Tesla driver.
Saying warnings are meaningless because they can be ignored would literally flip the entire legal system on its head. That is an insane way to think about things.
> they result in nothing changing if they are ignored.
That’s not true
> Do I still need to pay attention while using Autopilot?
> … Before enabling Autopilot, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Once engaged, Autopilot will also deliver an escalating series of visual and audio warnings, reminding you to place your hands on the wheel if insufficient torque is applied. If you repeatedly ignore these warnings, you will be locked out from using Autopilot during that trip.
> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.
https://www.tesla.com/en_gb/support/autopilot
And you don't respond to your own point about it being called autopilot despite it not being an autopilot
>> If you repeatedly ignore the inattentive driver warnings, Autosteer will be disengaged for that trip. If you receive several ‘Forced Autopilot Disengagements’ (three times for vehicles without a cabin camera and five times for vehicles with a cabin camera), Autosteer and all features that use Autosteer will be temporarily removed for approximately one week.
There are videos of people on autopilot without their hands on the wheel...
> And you don't respond to your own point about it being called autopilot despite it not being an autopilot
I don’t follow what you mean here? Are you confusing me with someone else?
> There are videos of people on autopilot without their hands on the wheel...
You can definitely remove your hands momentarily. I’ve seen people apply a weight to the steering wheel to fool it too. Not sure how people defeating the safety features would be Tesla’s fault.
First of all I stated my bias.
What part of how autopilot is marketed do you find to be gross negligence?
I would ask, what is the existing definition of autopilot as defined by the FAA? Who is responsible when autopilot fails? That's the prior art here.
Additionally, if the NTSB failed to clearly define such definitions and allowances for marketing, is that the fault of Tesla or the governing body?
I'm pretty neurotic about vehicle safety and I still don't think this clearly points to Tesla as being in the wrong with how they market these features. At best it's subjective.
>What part of how autopilot is marketed do you find to be gross negligence?
The fact that it's not an autopilot is a great start.
>I would ask, what is the existing definition of autopilot as defined by the FAA? Who is responsible when autopilot fails? That's the prior art here.
I don't think the FAA defines terms, and prior art is something specific to patents that has no relevance to the worlds of marketing and product safety.
>Additionally if NTSB failed to clearly define such definitions and allowments for marketing, is that the fault of Tesla or the governing body?
NTSB does not approve marketing, nor does it provide such definitions. On what basis do you suggest they did anything of the sort, or that Tesla needed their approval?
>>Additionally if NTSB failed to clearly define such definitions and allowments for marketing, is that the fault of Tesla or the governing body?
It's Tesla's. They marketed a product that does not do what they claim it does. The fact that when it does not do those things it can cause (deadly) harm to others, is why they received such a steep adverse judgment.
>I'm pretty neurotic about vehicle safety and I still don't think this clearly points to Tesla as being in the wrong with how they market these features. At best it's subjective.
Who cares how neurotic you think you are? You haven't come across reasonable in this conversation at all.
> At best it's subjective.
It's objectively not autopilot.
The FAA does define how autopilot can and should be used, and so should the NHTSA (mixed up the transpo acronyms) for ADAS. I suspect the FTC may address false marketing claims if NHTSA does not.
> You haven't come across reasonable in this conversation at all.
This is a discussion. We can disagree. No need to attack me.
> No need to attack me.
I'm not attacking you, it's a direct response to your frequent appeals to yourself as some sort of authority for reason and sensibility in this discussion, when your responses clearly indicate that you are being neither reasonable nor sensible.
> The FAA does define how autopilot can and should be used,
Yeah... in airplanes.
"it's never the crime... its the cover up". So in this case, they are kinda screwed.
I've owned two Teslas (now a Rivian/Porsche EV owner). Hands down, Tesla has the best cruise control technology on the market. Therein lies the problem. Musk constantly markets this as self driving. It is NOT. Not yet at least. His mouth is way way way ahead of his tech.
Heck, stopping for a red light is a "feature", even though the car is perfectly capable of recognizing the light and doing so. This alone should warrant an investigation, and it's one that I, as a highly technical user, completely fell for when I first got my Model 7 delivered... Ran through a red light trying out Autopilot for the first time.
I'm honestly surprised there are not more of these lawsuits. I think there's a misinterpretation of the law by those defending Tesla. The system has a lot of legalese safe-guards and warnings. But the MARKETING is off. WAY OFF. and yes, users listen to marketing first.
And that ABSOLUTELY counts in a court of law. You folks would also complain about obtuse EULAs, and while this isn't completely apples to apples here, Tesla absolutely engages in dangerous marketing speak around "autopilot", eliciting a level of trust from drivers that isn't there and that they should not be encouraging.
So sorry, this isn't a political thing ( and yes, disclaimer, also a liberal).
Signed... former Tesla owner waiting for "right around the corner" self driving since 2019...
> ABSOLUTELY counts in a court of law
Are there clear guidelines set for labeling and marketing of these features? If not, I'm not sure how you can argue such. If it was so clearly wrong it should have been outlined by regulation, no?
Tesla's not being treated unfairly. It advertised Autopilot as having more capabilities than it actually did. Tesla used to sell Autopilot as fully autonomous. ("The driver is only there for legal reasons.")
And it didn't warn users about this lack of capabilities until it was forced to do so. Those warnings you're talking about were added after this accident occurred as part of a mandated recall during the Biden administration.
> Those warnings you're talking about were added after this accident occurred as part of a mandated recall during the Biden administration.
If that's the case, this is certainly a stronger argument. I thought autosteer and FSD always had this dialog. As far as I know these dialogs go back 10 years and this was April 2019.
Even still, I find retroactive punishment of this to be dubious. If Tesla is liable to some degree, so should the NHTSA be, to the extent that anyone who makes the rules can be, for not defining this well enough to protect drivers.
> If Tesla is liable to some degree, so should the NHTSA
That's ridiculous. Tesla chose to make dangerous claims that resulted in the loss of dozens of lives. Tesla alone should be liable for this, not the regulator that eventually forced them to add the disclaimers so that consumers would have at least some modicum of notice that Tesla's advertising was actually just a package of lies.
> Autopilot is cruise control. When you understand this, claiming that Tesla is partially at fault here does not match the existing expectations of other driver assistance tech.
The problem is for several years they actively targeted a customer base incapable of understanding the limitations of the mis-named system they advertised. (Those customers capable of understanding it were more likely to buy vehicles from brands who advertised more honestly.) While the current approach of targeting Nazi and friend-of-Nazi customers might eventually change the story (with its own risks and downsides, one imagines), for the time being it seems reasonable that Tesla bear some responsibility for the unsafe customer confusion they actively courted.
> This all comes down to an interpretation of marketing speak. If you believe "autopilot" is misleading you'd agree with the jury here, if you don't you wouldn't. I'm no lawyer, and don't know the full scope of requirements for autopilot like features, but it seems that Tesla is subject to unfair treatment here given the amount of warnings you have to completely ignore and take no responsibility for. I've never seen such clear warnings on any other car with similar capabilities. I can't help but think there's maybe some politically driven bias here and I say that as a liberal.
And that's exactly why the law is supposed to have a Reasonable Person Standard.
https://en.wikipedia.org/wiki/Reasonable_person
When the majority of Tesla owners are completely unaware of the viability of Autopilot even in 2025, how exactly does it make any sense to blame the marketing when someone was so trusting of the unproven technology back in 2019? Especially given so many reports of so many people being saved by said technology in other circumstances?
I imagine these things will get better when courts can no longer find jurors who are unfamiliar with the attention-monitoring nags that Teslas are famous for.
If this is the $300M jury case, 100% they will win on appeal. The driver is clearly responsible for driving, and there’s never a moment of doubt about it with Autopilot.
Note that the driver wasn't found to be fault-free (they got something like two thirds of the blame), so it's unclear why appeals would overturn this.
For what it's worth:
Hsu v. Tesla, Inc. (Los Angeles Superior Court, Case No. 20STCV18473). Autopilot allegedly swerved a Model S into a median on city streets; plaintiff also claimed an unsafe airbag deployment and misrepresentation. Final result: Tesla won (defense verdict). Jury awarded zero damages and found no failure to warn; verdict entered April 21, 2023.
Molander v. Tesla, Inc. (Riverside County Superior Court). Fatal 2019 crash (driver Micah Lee) where plaintiffs said Autopilot suddenly veered off the highway into a palm tree; suit alleged defective Autopilot and failure to warn. Final result: Tesla won (defense verdict). Jury sided with Tesla on October 31, 2023; no plaintiff recovery.
Huang v. Tesla, Inc. (Santa Clara County Superior Court). 2018 Mountain View fatal crash (Walter Huang) while Autopilot was engaged; wrongful death, defect, and failure-to-warn theories. Final result: Settled confidentially in April 2024 before trial.
Estate of Jeremy Banner v. Tesla, Inc. (Palm Beach County; related 4th DCA appeal). 2019 Delray Beach fatal crash where a Model 3 on Autopilot underrode a crossing tractor-trailer; plaintiffs alleged Autopilot was defective and oversold. Appellate development: In Feb. 2025, Florida’s Fourth DCA limited the case by blocking punitive damages that a trial court had allowed. Final result: Settled in July 2025 (confidential) before a compensatory-only trial could proceed.
Appeal of what and on what grounds? Are you an attorney or are you just making this up?
Wouldn’t you be shocked to learn the guy with the username that specifically goes out of their way to say how much they know doesn’t actually know a damned thing about what they are talking about.
the username is ironic, silly bun
I am making this up
Based on (almost?) all prior cases though.