I don't think it's purely stubbornness. Tesla sold the promise of software-only updates resulting in FSD to hundreds of thousands of people. Not all of those people are in the cult of Tesla. I would expect that admitting defeat at this point would result in a large class action lawsuit, at the very least.

That wouldn't keep them from equipping _new_ models with additional sensors and spinning a story about how this helps them train the camera-only AI, or whatever.

I know it's "illegal" and technically sold as FSD (Supervised), but just two days ago I was in a friend's Model Y and it drove from work to my house (both in San Jose) without any steering wheel or pedal input at all. And he told me he went to Palm Springs like that too.

I shit on Tesla and Elon at every opportunity, and it's a shame they have the software out there doing things it probably shouldn't, but I don't think they're that far behind Waymo where it really matters, which is the thing actually working.

I suspect they have a long tail problem with FSD. It might work fine 99% of the time, but that's simply not good enough.

Palm Springs from San Jose? Admittedly freeway throughout, but that's quite impressive.

The terms of service probably require you to sue Tesla in that Texas district with Elon's corrupt judge pal.