I don’t know. But if human drivers are having to go around them, they’re not doing the right thing.
They need to drive or pull over. Never just stop there in the road and wait.
> if human drivers are having to go around them, they’re not doing the right thing
They're not. But it's also not a disaster. Pretending it is on Twitter is pandering, not policymaking.
> They need to drive or pull over. Never just stop there in the road and wait
Agreed. Waymo has a lesson to learn here. Sacramento and the NHTSA, likewise, need to draw up emergency minimums for self-driving cars.
There are productive responses to this episode. None of them involve flipping out on X.
> But it's also not a disaster
Because it’s a power outage. If we had instead learned about this during a real disaster, people could have died, because these things were let on the road without any plan for what they should do in abnormal circumstances.
We’re lucky it’s not a disaster.
> If we instead learned about this during a real disaster people could have died
This is universally true. The question is how bad could it have been, and in which cases would it have been the worst?
> We’re lucky it’s not a disaster
This is always true. Again, the question is how lucky?
We have an opportunity to count blocked emergency vehicles and calculate a hypothetical body count. That lets us characterize the damage. It also helps rein in the hysteria.
> We’re lucky it’s not a disaster.
I'm sure that if this had been something predictable, like a cyclone or a wildfire, Waymo would still have had 100% of their nightly traffic on the road, right? And SFFD would not have been able to do what they normally do when they can't get support, which is hop into the car and use the controls to move it manually?
Or... maybe Waymo HAS considered what their cars should do in abnormal circumstances and this kind of outcome was considered acceptable for the number of cars and the nature of the "disaster"?
> Waymo has a lesson to learn here.
At what point can we be spared from having Waymo's lessons inflicted upon us?
In this country, if heart disease or cancer doesn't kill you, a car probably will.
Until "Waymo's lessons" are killing people at that rate, I am 100% OK with a Waymo adding an extra 5 minutes to every 50th trip, or whatever the real stat is.
I was curious whether Waymo has ever been involved in a crash that killed someone, so I looked it up. The answer is yes - there was a Tesla going 98 mph in SoMa whose driver died after hitting a Waymo. Clearly the Waymo's fault!
> At what point can we be spared from having Waymo's lessons inflicted upon us?
Again, we had a real event happen. Not hypothetical. What was the actual cost inflicted?
When humans cause fewer accidents and fatalities than Waymo does, on average. Until then, people are still inflicting those lessons on us.
When we learn our lesson that letting companies beta-test on public roads, consequence-free, is just another cost passed on to the rest of us so that a small number of people can enrich themselves at our expense.
Whenever they become enough of a problem that they outweigh the public and private interest in having and improving robotaxis. For most people, we are nowhere near that point.
No, it's not a disaster, but with a little imagination it could be a hormetic inoculation.
> They're not.
They are. I went around one myself yesterday because it was sitting at the front of a turn lane at a dead light, just waiting there forever with its blinker on.
Waymos shouldn't exist, and San Francisco shouldn't just be an experimentation lab for tech companies.