> Does it matter

Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")

If Waymo literally didn't foresee a blackout, that's a systemic problem. If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.

> > Does it matter

> Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")

No, I'm not inferring internal processes.

I'm guessing level of critical thinking.

When you are creating autonomous vehicles, one of the things you want to risk-assess and have mitigations for is what the vehicles should do when the systems they depend on fail (e.g., electricity, comms).

Now, it could be that the team has anticipated those things but some other failure in their systems has caused vehicles to stop in the middle of intersections, blocking traffic (as per the article).

I'm super curious to learn more about what Waymo encountered and how they plan to up their game.
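
A rough sketch of the kind of dependency-failure thinking I mean, with made-up dependency names and fallbacks (nothing here is Waymo's actual architecture):

```python
# Hypothetical sketch: enumerate the external systems an AV fleet depends on
# and decide the degraded behavior ahead of time instead of improvising it.
# Names and fallbacks are illustrative, not Waymo's actual systems.
from enum import Enum, auto


class Dependency(Enum):
    POWER_GRID = auto()       # traffic signals, charging infrastructure
    CELLULAR = auto()         # remote assistance, fleet coordination
    GPS = auto()              # coarse localization
    REMOTE_OPERATOR = auto()  # the human-in-the-loop escalation path


# One pre-decided fallback per dependency.
FALLBACKS = {
    Dependency.POWER_GRID: "treat dark signals as all-way stops, reduce speed",
    Dependency.CELLULAR: "continue on local autonomy, pull over at the next safe spot",
    Dependency.GPS: "localize with lidar/camera against the onboard map",
    Dependency.REMOTE_OPERATOR: "never wait inside an intersection; clear it, then park",
}


def fallback_plan(failed: set[Dependency]) -> list[str]:
    """Return the pre-decided mitigations for whichever dependencies are down."""
    return [FALLBACKS[d] for d in failed]


if __name__ == "__main__":
    # A blackout tends to take out power and overload cellular at the same time.
    print(fallback_plan({Dependency.POWER_GRID, Dependency.CELLULAR}))
```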

> I'm not inferring internal processes…I'm guessing level of critical thinking

Genuine question: how do these differ? Isn’t the level of critical thinking of Waymo’s employees internal to it? (What’s the mens rea analogue for a company?)

The "coinciding problems" should be an assumption, not a edge case we reason away. Because black swan events are always going to have cascading issues - a big earthquake means lights out AND cell towers overloaded or out, not to mention debris in streets, etc.

What they need is a "shit is fucked" fallback that cedes control. Maybe there is a special Bluetooth command any nearby police officer or ambulance can send, like "clear the intersection/road."

Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance. To techies and lawyers it may sound impossible, but to normal humans it certainly sounds better. Like that Mitch Hedberg joke: when an escalator is out of order, it becomes stairs. When a Waymo breaks, it should become a car.
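
A rough sketch of what that ceded-control fallback could look like. The Bluetooth transport, the shared key, and the command names are all assumptions for illustration; real credentials would be per-agency with rotation and expiry:

```python
# Hypothetical "cede control" fallback: a nearby responder broadcasts a short,
# signed command and the car only honors a small set of bounded maneuvers.
import hashlib
import hmac

# A single shared secret keeps the sketch self-contained; it is not how
# real emergency-responder credentials would work.
RESPONDER_KEY = b"demo-key-not-real"

ALLOWED_COMMANDS = {
    "CLEAR_INTERSECTION": 30.0,  # creep forward at most 30 m to a safe stop
    "PULL_TO_SHOULDER": 100.0,   # move at most 100 m to leave the travel lane
}


def sign(command: str, key: bytes = RESPONDER_KEY) -> str:
    """HMAC the command so a random phone can't order the car around."""
    return hmac.new(key, command.encode(), hashlib.sha256).hexdigest()


def handle_broadcast(command: str, signature: str) -> str:
    """Accept only known, distance-bounded maneuvers with a valid signature."""
    if command not in ALLOWED_COMMANDS:
        return "ignored: unknown command"
    if not hmac.compare_digest(sign(command), signature):
        return "ignored: bad signature"
    limit_m = ALLOWED_COMMANDS[command]
    return f"executing {command}, capped at {limit_m} m, hazards on"


if __name__ == "__main__":
    print(handle_broadcast("CLEAR_INTERSECTION", sign("CLEAR_INTERSECTION")))
    print(handle_broadcast("DO_DONUTS", sign("DO_DONUTS")))
```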

> Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance.

Do they even have physical controls to do that at this point?

I’ve never been in one so I don’t know how different they are from normal cars today.

The Waymos still have all their normal driver controls. There is a process where law enforcement can enter the vehicle, call Waymo and verify their law enforcement status, and then switch the vehicle into manual mode and drive it as normal.

Here are the instructions for law enforcement in the Waymo Emergency Response Guide:

https://storage.googleapis.com/waymo-uploads/files/first%20r...

Ok. Thanks. I must have been thinking of something else.

Didn’t Google have little self-driving vehicles without controls that were limited to pre-planned routes on non-public roads on their campus?

Obviously a hugely different problem domain.

> If Waymo literally didn't foresee a blackout, that's a systemic problem.

I agree with this bit.

> If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.

This is what I have a problem with. That’s not an edge case. There will always be a weird thing no one programmed for.

Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.

It doesn’t matter. There has to be an absolute minimum fail-safe that always works if the car is capable of moving safely. The fact that a human driver couldn’t be reached to press a button and tell the car to execute it is not acceptable. Not having a human available is a totally foreseeable problem. It’s Google. They know networks fail.
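
A minimal sketch of that absolute-minimum fail-safe; the timeout and the checks are made up for illustration, not actual Waymo parameters:

```python
# Hypothetical local fail-safe: if no remote human responds within a deadline
# and the car is blocking an intersection, clear it on local authority alone.
from dataclasses import dataclass

REMOTE_ASSIST_TIMEOUT_S = 30.0  # assumed deadline, not a real parameter


@dataclass
class VehicleState:
    seconds_since_remote_contact: float
    stopped_in_intersection: bool
    path_ahead_clear: bool  # from onboard perception only


def minimum_fail_safe(state: VehicleState) -> str:
    """Decide locally once the network (and the human behind it) is gone."""
    if state.seconds_since_remote_contact < REMOTE_ASSIST_TIMEOUT_S:
        return "wait for remote assistance"
    if state.stopped_in_intersection and state.path_ahead_clear:
        return "creep out of the intersection, then park with hazards on"
    return "stay put, hazards on, unlock the doors"


if __name__ == "__main__":
    stuck = VehicleState(120.0, stopped_in_intersection=True, path_ahead_clear=True)
    print(minimum_fail_safe(stuck))
```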

This isn't to disagree with your overall point about proper emergency mitigation and having humans available.

> Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one‘s ever gonna program for that.

While the cause is unusual, this is really just three things that everyone absolutely should be programming into their autonomous vehicles: accidents, road debris, and slick conditions.

Certainly. That one was interesting both because of the odd specifics of it and because it made the road more slippery than any normal accident where just a bunch of boxes of random dry goods fell over.

It just happens to make a fantastic example of “thing no one is ever going to foresee”.

If there wasn’t footage how many people would even believe it happened?