> This hurts competent people who want to have responsibility and the freedom that brings.
Perhaps, but does it hurt more or less than getting life-changing injuries and your partner killed by a Tesla?
So, we have to ignore the entire safety record for the entire technology just because one operator has failed to follow the instructions?
This is especially the case for something that was in its infancy back in 2019 when this crash happened.
And you know what we have in 2025 because of those restrictions being enforced since then?
In 2025, Teslas nag drivers so much for not paying attention to the road that drivers no longer keep the much safer versions of autopilot engaged at all when reaching for their phones.
Instead, now that the issue is "fixed", Tesla drivers simply do the same thing drivers of any other car do in that situation.
They disable autopilot first, and only then stop paying attention to the road to look for their phone.
How's that safer?
We're precisely less safe because of these regulatory requirements.
(And, to add insult to injury, this court is now using 20/20 hindsight, treating the warnings that were subsequently implemented as evidence of Tesla's wrongdoing back in 2019, before anything like that was even thought possible? Even though, now that these warnings have been implemented, we already have evidence that the nags themselves make everybody less safe, since autopilot simply gets turned off when you need to stop paying attention to the road?)
What safety record are we ignoring? Can you please cite some scientifically rigorous and statistically sound data, evidence, and analysis?
Or are you talking about self-published numbers by a company that has been proven to withhold, lie, and misdirect even in official police investigations, subpoenas, and trials, where doing so is actively illegal?
Are we talking about numbers whose degree of scientific rigor would be unfit for a middle school science fair, let alone the minimum standard of rigor that members of their team had to meet to earn their degrees, yet somehow fail to apply when describing systems that are literally responsible for the life and death of humans?
So where's your unbiased data then?
Where's your data that these nags make everyone safer, when it's widely known that they simply result in people turning off the entire autopilot/FSD when the operator needs to stop paying attention to the road, to avoid the nags and the penalty strikes?
Where are all the news reports about the crashes without autopilot engaged? If they were as rare as the autopilot ones, surely we'd have seen some of them covered by the media, right? Or are they so rare that not a single one has happened yet, hence the lack of any reports?
You are the one claiming it has a “safety record”.
You are the one claiming “We're precisely less safe because of these regulatory requirements.”
Support your assertion with scientifically rigorous, statistically sound evidence.
And no, your ignorance of safety problems is not evidence of safety despite your attempts to argue as such. That was not a valid argument when the cigarette companies made it and it is not valid now.
The law is not supposed to make the utilitarian choice of what causes the least net harm to all people, individual rights be damned. That someone died while unchained is not a compelling argument for everyone to be put in irons.
Everyone without a driver's license and liability insurance is in chains with respect to driving a car on public roadways, and it is the broadly unanimous opinion that this makes perfect sense. It's a mix of utilitarian grounds, externalities, etc.
You can't build a large dynamite factory in a residential neighborhood either even if you don't intend for it to blow up.
Well yeah, the law isn't "do whatever you want, there are no rules" either. The important thing is that getting your license and insurance is purposefully an extremely low bar. Driving is a dangerous activity; someone could hurt or kill themselves or others every time they get behind the wheel, but so long as you have demonstrated the most basic understanding of what you're doing (i.e. getting a license), and taken responsibility for the potential consequences of your actions (i.e. carrying insurance), you're free to take that risk. Note, you still don't have the right to run over pedestrians, but if you do, you will be held liable, not the manufacturer of your car or the state which granted you a license. You would likely lose your license under such a circumstance, but no one else will lose theirs to mitigate the risk of future pedestrian impacts.
Zoning laws are a complete non sequitur. The issue with building a large dynamite factory in a residential neighborhood is the threat to the people of the neighborhood who not only didn't consent to live near it but specifically chose to live in an area zoned so that such things could not be built. Building a dynamite factory wherever you want is not something you have the innate right to do. That said, you probably can get a permit (assuming you have the proper licenses and insurance) to build a dynamite factory in an appropriately zoned area.
There is no car sold in America which could have prevented this accident.
Yes, that is exactly Tesla's line, and it is entirely beside the point.
The point is the EXPECTATION set by decades of basically flat-out lying about the capabilities of "Autopilot" and "Full Self Driving".
Hell, the names alone point entirely to no-limitations full self-driving. Plus selling $10k upgrades to use your car as a driverless taxi with the next software upgrade, which, years later, has never happened. The fine print is BS; the message screams nothing but capable driving.
Add to that the instant data collection and the real-time deletion of it at the scene, and Tesla is 100% wrong. (And I used to admire Tesla and aspire to own one.)
And this is 100% Tesla's own fault. They did NOT have to market it that way. Had they marketed it as "Advanced Driving Assistant" or "Super Lane-Keeping", or something, and left the data available to the driver, they likely could have won this case easily. "The guy wasn't even looking, we never even implied it could take over all driving, what the hell was he thinking?".
Actual "autopilot" such as the kind actual pilots use does not give the pilot the ability to disregard the operation of their airplane.
How does calling a feature "autopilot" then give consumers the impression that they can completely hand over operation of the car? Driving a car is a serious task, and this driver was extremely negligent in executing that task.
The "Autopilot" on planes is a lovely technical distinction. Hardly any non-aviator knows the details, and the common idea is that the plane pilots itself in most phases of non-emergency flight. And indeed, some autopilot functions can even auto-land according to the flight plan. But there is a bit of wiggle-room there to say people should expect to pay attention.
Tesla's "Full Self Driving" leaves no such room. The only exception would be to consider Tesla to be lying.
"Full", adjective Containing all that is normal or possible. "a full pail." Complete in every particular. "a full account."
That's just the first DDG search. Go to ANY other dictionary and show me where "Full" in ordinary American English usage means anything other than complete, no exceptions, etc.
"Full Self Driving" literally means "It FULLY drives itself".
It does NOT mean "You and the automobile co-drive the car", or "Mostly self-driving", or "Sometimes self-driving", or "Self driving until it can't figure it out or fcks up", the latter of which seems the most accurate.
If Tesla had called it ANY of those other things, they would be fine. And would be honest.
But instead, Musk and Tesla decided to lie in plain language. And for that, they lost the case, and will likely lose many others.
For some people, honesty matters. And honesty is not splitting hairs over differences between detailed technical meaning vs colloquial understanding ("autopilot"), or using the end-goal a decade+ away as the official name and description of the feature set ("Full Self Driving"). That is dishonest.
Driving a car is not a "colloquial" activity; it is a technical activity. Splitting hairs about the technical operation of a car is exactly what a competent operator should be doing.
Regardless, this car did not have FSD, it had "autopilot".
Driving a car in the USA with a license requires passing roughly a five-minute written test on the rules of the road and a five-minute driving test. It is an almost universal event for residents of the country.
Piloting an aircraft with an autopilot requires a MINIMUM of 1500 hours of instruction and experience as well as multiple levels of certification (VFR, IFR, multi-engine, and specific type certification).
You are seriously trying to claim that these are even remotely similar activities?
Yes, drivers SHOULD split hairs over the technical operation of the vehicle.
Should and Is/Are/Do are NOT the same thing. Particularly when the company founder and prime spokesperson brays about how it will do everything for you, and constantly overpromises stuff that won't be available for a decade (if ever) as if it were here already.
sheesh
> Piloting an aircraft with an autopilot requires a MINIMUM of 1500 hours of instruction and experience as well as multiple levels of certification (VFR, IFR, multi-engine, and specific type certification).
What? A full private pilot license only requires 35 or 40 hours of flight time (depending on school type); a sport license only requires 20 hours. Airplanes flyable by new pilots of either type very often have autopilots today.
But Tesla doesn't have "Full Self Driving" period, they only have "Full Self-Driving (Supervised)" (and, prior to that, it's been FSD Beta).
You can't just keep ignoring the "Supervised" bit as if it's not there. Just because you think it's a stupid name, doesn't make it a lie. Have you even tried it yourself? I've tried v12, and it's amazing. It does fully self-drive. Why would they call it "mostly" if the idea has always been that it'd be "full" when it's done and out of beta? And as robotaxi shows, it's literally almost there.
I've just tried searching "FSD site:tesla.com" in Google Search, and basically every single result is "Full Self-Driving (Supervised)", with very few exceptions.
Hint: what does the word ‘full’ mean?
"Full" means it drives itself as long as you "Supervise".
Did you use v12 or v13 FSD? I used v12 last year (which is far behind v13, and further still behind robotaxi). I'd enable it as soon as I was out of the garage, and it'd safely drive me right to the destination.
How exactly is that not "Full"? Why would they call it anything else when it can drive itself from point A to point B without any interventions most of the time?
>>Why would they call it anything else when it can drive itself from point A to point B without any interventions most of the time
Why?
Because the FACT is that you must put in the caveat MOST OF THE TIME, or someone is likely to die.
If they were honest they would call it "Supervised Mostly Self Driving". Even "Supervised Self Driving — you mostly supervise, not drive!" would be accurate.
Again, go to any dictionary and find the definition of "Full". Websters:
>>1 containing as much or as many as is possible or normal
>>2a complete especially in detail, number, or duration
>>2b lacking restraint, check, or qualification
>>2c having all distinguishing characteristics : enjoying all authorized rights and privileges
>>2d not lacking in any essential : perfect
The problem is Tesla and you are attempting to literally change the definition of "Full" to mean "Not Full".
This is lying or deceiving yourself, and deceiving others.
Recognize facts and that language actually has meaning, and stop being part of the problem. Or continue watching your hero lose lawsuits.
>>Why would they call it "mostly" if the idea has always been that it'd be "full" when it's done and out of beta?
Because "Mostly..." is the truth, and then when it is actually "Full..." they can come out and announce that fact with great fanfare. and they would have been honest.
Hell, if they simply called it "Supervised Self Driving", it would be honest, and actually match even your glowing description.
But they do not. Your and Tesla's idea of using the added tagline "(Supervised)" as a legal weasel word does not work either. "Full Self-Driving (Supervised)" is literally an oxymoron. A thing either drives itself fully, or it requires supervision. One contradicts the other.
IIRC, the "(Supervised)" bit was added well after the first fanfare with only "Full Self Driving" alone, when problems started to appear. And the common initials are "FSD".
Even if the reality of the feature set meets your glowing description, the problem is the small percentage of cases where it fails. I'm sure the guy in Florida who was decapitated when his Tesla failed to notice a semi-trailer turning across in front of him was similarly confident, and the same for the guy in California who was impaled on a construction traffic barrier. The problem is that it is NOT FULL, it is only full self driving until it fails.
>>And as robotaxi shows, it's literally almost there.
NO, it shows the exact opposite.
Nearly two months after the much-heralded rollout of (fully) self-driving taxis, Tesla still cannot put a single car on the road for a single meter without a supervising safety driver. Moreover, there have been numerous reported instances of the cars making dangerous errors such as left turns into traffic, etc.
>>basically every single result is "Full Self-Driving (Supervised)", with very few exceptions.
Again, that wording is a literally meaningless oxymoron, containing two mutually contradictory statements ("Full" vs "Supervised"), thus open to whatever interpretation the listener latches onto. Moreover, the emphasis is on the first word — "Full" — which is the lie.
I mean, this is just wrong.
Autopilot on a plane does actually drive the plane. You can go as long as hours without any human input requirement. Pilots can eat, go to the bathroom, what have you.
Of course we have two pilots, just in case, but this isn't necessary - some countries are pushing for one pilot because the vast majority of flying is done by the plane.
That doesn't mean that autopilot systems on planes are more sophisticated. It just means that automating a plane is much, much easier than automating a car.
We also have fully autonomous trains.
First of all, I don't condone the unlinking of snapshot_collision_airbag-deployment.tar after the upload, but, OTOH, I can also understand why something like that would be done. (If you were the owner of the car, would you want a subsequent owner to have the records of your crash? This is why I hate all these privacy advocates, because they ruin it for everyone.)
If you've ever worked at a company where security and privacy are taken seriously, you'd be fully aware that something like logging incorrect password attempts is a straight-up CVE waiting to happen, even though it's something a legitimate user might well want, to find out who's trying to break into their system. Thank the privacy advocates.
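To make that point concrete, here's a minimal, hypothetical sketch (the function and names below are mine, not anything from Tesla or any specific codebase): users routinely type a password into the username box, so the raw value of a failed login attempt may itself be a secret, and careful systems log a truncated hash rather than the raw input.

    import hashlib
    import logging

    log = logging.getLogger("auth")

    def record_failed_login(submitted_username: str, source_ip: str) -> None:
        # Hypothetical handler: never log the raw submitted value, since it may
        # actually be a password typed into the wrong field. A truncated hash
        # still lets you correlate repeated attempts without turning the log
        # file into a credential dump.
        digest = hashlib.sha256(submitted_username.encode()).hexdigest()[:12]
        log.warning("failed login from %s (user-hash=%s)", source_ip, digest)

Same tension as the crash snapshot: the data is useful to the legitimate party, but retaining it creates a privacy liability for everyone else.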
But I fail to see who exactly is misled by the marketing, ESPECIALLY given the non-stop negative media attention Tesla has always had.
I mean, it's literally already called FSD Supervised, and previously it's been FSD Beta. How exactly is that not self-explanatory?
But if you've already reached a conclusion, the name is irrelevant. I mean, have you ever looked at the Autopilot article on Wikipedia? It's about the aircraft system. Why don't we see the FAA complaining to Boeing and Airbus that their airplanes have this super-misleading "autopilot", even though the pilots must still be supervising the tech?
Yikes
You entirely miss the distinction between trained professionals and ignorant consumers, between "industrial use only" equipment and materials and everyday consumer goods, between things that require technical training and even certification to use properly and goods sellable to any consumer, between prescription drugs and OTC.
The technical goods carry a much higher risk and must be used with specific training and context. Using them outside those contexts is likely to harm or kill people, and creates legal liability.
In contrast, consumer goods must be engineered to be safe in ORDINARY circumstances with UNTRAINED people using them.
Tesla is trying to paper over those differences, using technical terms in a common environment and unleashing tools that require skilled supervision to prevent death into a consumer environment, for profit. You are either failing to make the distinction or consciously going along with it.
decades?
Musk joined Tesla with the Series B round in 2005. Doesn't seem that way, but it's been 20 years.
He first started talking publicly about "Autopilot" or "Full Self Driving" in 2013, so 1.2 decades would be referred to as plural decades. (I didn't have the exact number on hand, but knew it was 1+, and used the proper form; you prompted me to look up the exact number.)
And would the driver’s actions have been different if they had understood that? Was their lack of understanding coincidence, correlated with their Tesla ownership by no fault of Tesla, or deliberately engineered by Tesla’s marketing approach?
You know, I've talked to a whole bunch of people who actually own Teslas, who actually work in tech, and most of them are completely unaware of any of these autopilot features whatsoever.
Most people are actually very dismissive of autopilot, and are completely misinformed about the benefits / drawbacks / differences of "FSD" versus "Autopilot".
Most are completely unaware of the improvements in v12 or v13, or the differences between HW3 and HW4, or which one they have, or that "autopilot" is free, or the circumstances under which autopilot can be used, etc.
I talked to some guy last year (mid 2024) who was actually paying $199/mo for FSD v12, before the price drop to $99/mo, and swearing how great it was, yet he had never tried the parking feature, even though it had been released several months prior. He's a software engineer. That's just one example.
So, if anything, Tesla's marketing is nowhere near as successful as these naysayers would have you believe. Because the vast majority of Tesla's own customers are actually far behind on autopilot or FSD buy-in, and are NOT aware of the progress.
An average of 100 people die every day in the US due to traffic accidents, many of which would have been prevented by Tesla-like software. You're obsessing about the wrong side of the equation.