> Seems like a power outage is an obvious use case Waymo should have foreseen
We have zero evidence a power outage wasn't foreseen. This looks like a more complex multi-system failure.
Does it matter?
Once you’re on public roads, you need to ALWAYS fail-safe. And that means not blocking the road/intersections when something unexpected happens.
If you can physically get out of the way, you need to. Period.
> Does it matter
Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")
If Waymo literally didn't foresee a blackout, that's a systemic problem. If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
> > Does it matter
> Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")
No, I'm not inferring internal processes.
I'm guessing level of critical thinking.
When you are creating autonomous vehicles, one of the things that you want to risk assess and have mitigation for is what you want the vehicles to do in case the systems they depend on fail (e.g. electricity, comms).
Now, it could be that the team anticipated those things but some other failure in their systems caused vehicles to stop in the middle of intersections, blocking traffic (as per the article).
I'm super curious to learn more about what Waymo encountered and how they plan to up their game.
> I'm not inferring internal processes…I'm guessing level of critical thinking
Genuine question: how do these differ? Isn’t the level of critical thinking of Waymo’s employees internal to it? (What’s the mens rea analogue for a company?)
The "coinciding problems" should be an assumption, not an edge case we reason away. Black swan events are always going to have cascading issues — a big earthquake means lights out AND cell towers overloaded or out, not to mention debris in streets, etc.
What they need is a "shit is fucked fallback" that cedes control. Maybe there is a special bluetooth command any police or ambulance can send if nearby, like clear the intersection/road.
Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance. To techies and lawyers it may sound impossible, but for normal humans, that certainly sounds better. Like that Mitch Hedberg joke, when an escalator is out of order it becomes stairs. When a Waymo breaks it should become a car.
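For illustration, the "becomes a car" fallback above could be sketched as a simple mode policy. Everything here is hypothetical — the class names, the credential check, and the Bluetooth/first-responder mechanism are my own invention, not anything from Waymo's actual software:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    STOPPED_SAFE = auto()   # hazards on, holding position
    MANUAL = auto()         # a human has physical control

class FallbackPolicy:
    """Hypothetical degraded-mode policy: when remote systems are
    unreachable, stop safely; then let a verified human on scene
    take physical control, as the parent comment suggests."""

    def __init__(self):
        self.mode = Mode.AUTONOMOUS

    def on_comms_lost(self):
        # Can't reach remote assistance: fail to a stopped state.
        if self.mode is Mode.AUTONOMOUS:
            self.mode = Mode.STOPPED_SAFE

    def on_local_override(self, credential_ok: bool):
        # E.g. a first responder authenticates in person (the exact
        # credential mechanism is purely speculative).
        if self.mode is Mode.STOPPED_SAFE and credential_ok:
            self.mode = Mode.MANUAL

policy = FallbackPolicy()
policy.on_comms_lost()
policy.on_local_override(credential_ok=True)
print(policy.mode.name)  # MANUAL
```

The point of the sketch is only that the override is gated: it is reachable exclusively from the already-stopped state, so "any human can drive it" never preempts the vehicle while it is operating normally.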
> Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance.
Do they even have physical controls to do that at this point?
I’ve never been in one so I don’t know how different they are from normal cars today.
The Waymos still have all their normal driver controls. There is a process where law enforcement can enter the vehicle, call Waymo and verify their law enforcement status, and then switch the vehicle into manual mode and drive it as normal.
Here are the instructions for law enforcement in the Waymo Emergency Response Guide:
https://storage.googleapis.com/waymo-uploads/files/first%20r...
Ok. Thanks. I must have been thinking of something else.
Didn’t Google have little self-driving vehicles without controls that were limited to pre-planned routes on non-public roads on their campus?
Obviously a hugely different problem domain.
>If Waymo literally didn't foresee a blackout, that's a systemic problem.
I agree with this bit
> If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
This is what I have a problem with. That’s not an edge case. There will always be a weird thing no one programmed for.
Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.
It doesn’t matter. There has to be an absolute minimum fail-safe that can always work if the car is capable of moving safely. The fact that a human operator couldn’t be reached to press a button authorizing it is not acceptable. Not having the human available is a totally foreseeable problem. It’s Google. They know networks fail.
This isn't to disagree with your overall point about proper emergency mitigation and having humans available.
> Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.
While the cause is unusual, this is really just three things that everyone absolutely should be programming into their autonomous vehicles: accidents, road debris, and slick conditions.
Certainly. That one was interesting both because of the odd specifics of it and because it made the road more slippery than any normal accident where just a bunch of boxes of random dry goods fell over.
It just happens to make a fantastic example of “thing no one is ever going to foresee”.
If there wasn’t footage how many people would even believe it happened?
A fail-safe is EXACTLY blocking roads at intersections without power, not proceeding through intersections without power. It's much safer to be stopped than to keep going. I honestly wish the humans in my neighborhood who were driving through blacked-out intersections without slowing down last night actually understood this.
It’s not a fail-safe. It’s a different failure mode. Jamming up traffic, including emergency traffic, creates systemic problems.
It’s a bit like designing an electronic lock that can’t be opened if the power goes out. If your recourse to exiting a dangerous situation becomes breaking the door, then the lock is unsafe.
Fail-safe means "in a situation where the function fails, fail in a way that doesn't cause injury" -> the cars didn't know how to proceed, so they stopped, with their lights on, in a way that any attentive driver could safely navigate... which is failing safe.
The alternative here is a protocol that obviously hasn't been tested. How on earth are you going to test a Waymo in blackout conditions? I would rather have them just stop than hope they navigate those untested conditions with vulnerable pedestrians and vehicles acting unpredictably.
Simulate them on a test course? There are absolutely places with street lights and everything where you could test this. Hell, since the lights don’t need to actually work, you could just put some up in a parking lot to test with. Who cares.
You don’t need to wait for a city blackout to actually test this kind of scenario.
The thing still has cameras. And LIDAR. It should be fully capable of pulling over on its own safely. Why would not having a traffic light prevent that?
Humans are expected to negotiate this. The robots should be too. That’s part of driving. If the lights fail, the law tells you what you’re supposed to do. And it is not to stop in the intersection.
> Simulate them on a test course?
Yes, what’s the worst that could happen… oh wait… people literally getting killed.
When did a Waymo kill someone on a closed test course?
I'm suggesting that, perhaps, the vehicle will not perform the same way in a dangerous, real-world scenario as it would in a training exercise.
> Fail-safe means "in a situation where the function fails, fail in a way that doesn't cause injury"
In a very local sense, this is true. In terms of the traffic system, this can create a systemic problem if the stoppage causes a traffic jam that creates problems for emergency vehicles.
Thus it is a _different_ failure mode.
If someone stops in the middle of traffic because they’re lost, their GPS went out, or they realized that they’re unsafe to drive, we don’t celebrate that as the driver entering a fail-safe mode. We call that “bad judgment” and give them a ticket.
If it precipitates a larger problem where lives are lost, they may be in considerable legal or financial trouble.
I don’t see why we should treat Waymo any differently.
Traffic doesn’t cause injury. Why are we concerned about traffic flow in a blackout situation? The cars stopped at intersections; EMS could use the oncoming lanes. I’m not seeing how it’s not a fail-safe. You’re describing it as not being fail-ideal, and I would agree.
One way roads exist.
I'm confused, is your concern that enough Waymos shut themselves down on a one-way road, at the same intersection, so as to block the intersection? Yes, I could see that being a concern. I suspect it would be reported almost immediately, and would be at the top of the list for the folks at Waymo to address. The cars weren't abandoned. They eventually moved, though I suspect they had to be manually driven (virtually or otherwise) out of the way. I can see how this could be a problem, but considering it would likely be at the top of the list of problems for Waymo -- again, during an emergency -- I suspect it's not a serious concern in the long run.
Would I have preferred that they had a light flashing "An unknown emergency is occurring, please park me"? Yes, I think that would be a better solution. I would have preferred better performance from Waymo. My entire point here is that I'm happy that in my neighborhood, the Waymos were acting in a fail-safe manner, rather than just winging it.
An intersection without power is just a 4-way stop.
An intersection without power is supposed to be treated as a 4-way stop. Unfortunately, a nontrivial number of drivers last night were not following that rule.
And yet the humans managed.
Even at a normal four-way stop with stop signs people sometimes blow through it. The Waymo has to handle it.
That’s part of driving.
It can creep through at 3 miles an hour if it thinks that’s what’s safe. All it has to do is get out of the intersection.
The outrage people would rightly have at Waymo allowing a number of its vehicles to blow the lights would be huge. People running blacked out lights is unacceptable.
Who said “blow through”?
Waymos know how to handle 4 way stops.
You're anthropomorphizing. Waymos "know" how to handle the 4-way stops that they've been trained to handle.
> Once you’re on public roads, you need to ALWAYS fail-safe.
Yes.
> And that means not blocking the road/intersections when something unexpected happens.
No. Fail-operational is not the only allowable fail-safe condition for automobiles. For example, it is acceptable for loss of propulsion to cause stop-in-lane — the alternative would be to require high-availability propulsion systems, or to require drivers to always have enough kinetic energy to coast to side. This just isn’t the case.
One can argue that when operating a fleet with correlated failure modes the rules should change a bit, but that’s a separate topic.
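The parent's distinction — fail-operational is not the only acceptable fail-safe, and different failures legitimately get different responses — can be sketched as a decision table. The failure categories and response names below are my own illustration, not anything from Waymo or any safety standard:

```python
# Hypothetical mapping from failure category to fallback behavior,
# illustrating that "fail-safe" is not one single response.
FALLBACK = {
    "propulsion_lost":     "stop_in_lane",        # no energy to move: stop where you are
    "signal_power_out":    "treat_as_4way_stop",  # dark traffic light: the legal fallback rule
    "remote_assist_down":  "pull_over",           # sensors fine, creep to the curb
    "perception_degraded": "stop_in_lane",        # can't trust sensing: don't keep moving
}

def fallback_for(failure: str) -> str:
    """Unknown failure categories get the most conservative response."""
    return FALLBACK.get(failure, "stop_in_lane")

for f in ["propulsion_lost", "signal_power_out", "remote_assist_down", "eels_on_roadway"]:
    print(f, "->", fallback_for(f))
```

The design point is the default: novel, unclassified failures fall through to the most conservative behavior, while known-benign failures (a dark signal with working sensors) get a less disruptive response.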