What was the better solution here then? Assume there are hundreds or thousands of self-driving cars suddenly driving in an environment without any working traffic lights. In the pictures you can see six Waymo cars at a single intersection. Assuming some of them had passengers, should they all try to turn at the intersection anyway when their LIDAR says the lane is likely free, and then pull over to the side? Is that the safest option? Should there be human police to direct the self-driving cars through intersections? Or should they wait out the temporary electricity failure?
I believe the answer is far more complicated than it seems and in practice having the cars stay still might have been the safest option any of the parties could agree on (Waymo's office, the city traffic people, state regulators, etc).
There are people thinking this stuff through, and those cars can 100% pull over automatically, but an explicit choice was made not to do so for safety.
I think part of the problem is they’ve made it our problem.
Look, I like Waymo. I think they're neat and I trust them far more than any of the other companies. But in my mind, being able to handle stuff like this is just a requirement for being on the roads in any non-trivial number. If they had two vehicles and this happened, then OK, that's a problem, but it was two vehicles in an entire city.
When you have enough on the road that you can randomly have six at one intersection, you should absolutely be able to handle this by then.
I want them to do well. I want them to succeed. But just like with airliners, this is the kind of thing where people's safety comes first.
What we saw looks like the safety of the Waymo and its passengers was put above everyone else's, despite there being no need for that. There are certainly some situations where just staying put is the best decision.
"The power went out and there are no other hazards on the road" is not one of them. They made things worse for everyone else on average, in a foreseeable situation where it was totally unnecessary. And that's not OK with me.
This feels like the kind of thing that absolutely should've been tested extremely well by now, before they were allowed to drive in large volumes.
Effectively they’ve turned any edge case into a potential city-wide problem and PR nightmare.
One driver doesn’t know how to handle a power outage? It’s not news. Hundreds of automated vehicles all experience the same failure? National news.
I live in the affected neighborhood. There were hundreds of drivers that did not know how to handle a power outage... it was a minority of drivers, but a nontrivial number. I even saw a Muni bus blow through a blacked-out intersection. The difference is that the Waymos failed in a way that prevented potential injury, whereas the humans who failed all failed in a way that could create injury.
I wish the Waymos handled it better, yes, but I think that the failure state they took is preferable to the alternative.
Locking down the roads creates a lot of potential injuries too.
And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive them of all that much misbehavior.
> is super easy to program
What?!? We’re talking about autonomous vehicles here.
I wouldn't say "super easy", but if an autonomous vehicle isn't programmed to handle basic situations like a four-way stop or a dead traffic light, then it is 100% not qualified to be on the road. Those situations are incredibly easy to replicate, simulate, and incorporate into the training data. That is to say, they are not edge cases.
Dealing with other drivers in those settings is much harder, but that's a different problem, and you should be simulating your car against a wide variety of other-driver dynamics, from everyone being very nice to everyone being hyper-aggressive and the full spectrum in between.
The cars already know those are intersections with lights. I'm not talking about that part, just the basic logic that you don't go through at speed unless there is a green (or yellow) light.
Lights out should be treated as an all-way red, including for pedestrians.
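The core decision really is small. Here's a minimal sketch of that logic in Python (the states and function names are hypothetical, and a real AV stack is vastly more complicated around it):

```python
from enum import Enum

class LightState(Enum):
    GREEN = "green"
    YELLOW = "yellow"
    RED = "red"
    DARK = "dark"  # signal detected as unpowered at a known signalized intersection

def approach_behavior(light: LightState) -> str:
    """Hypothetical top-level rule for approaching a known signalized intersection."""
    if light in (LightState.GREEN, LightState.YELLOW):
        return "proceed"        # normal flow, at speed
    if light == LightState.DARK:
        return "all_way_stop"   # dead lights: treat as all-way red, yield to pedestrians
    return "stop_and_wait"      # red light

assert approach_behavior(LightState.DARK) == "all_way_stop"
```

The hard part, as noted above, is everything around this rule: detecting that the light is actually dead, and negotiating the stop with other drivers.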
Specifically identifying road signs, traffic lights, and dead traffic lights is a narrow problem that has feasible solutions, to the point where we can reasonably say "yeah, this sub-component basically works perfectly."
Compared to the overall self-driving problem, which is very much not a super easy problem.
Right. You know there are humans somewhere in the city who got confused or scared and messed up too. Maybe a young driver on a temporary permit who is barely confident in the first place, or just someone who doesn't remember what to do and was already over-stressed.
Whatever, it happens.
This was a (totally unintentional) coordinated screw-up causing problems all over, as opposed to one small spot.
The scale makes all the difference.
Definitely. The question then becomes: how do they respond to the stimulus of other, more experienced drivers?
E.g., if they see five cars going around them and "solving" the intersection, are they empowered to do the same? Or do some annoying honkers behind them make them bite the bullet and try their hand at passing it (and not to worry, other drivers will also make sure no harm comes to anyone even if you make a small mistake)? Human drivers, no matter how inexperienced, will learn on the spot. Self-driving vehicles can only "learn" back in the SW department.
Yes, driving is a collaborative activity which requires that we all partner on finding the most efficient patterns of traffic when traffic lights fail. Self-driving cars cannot learn on the spot, and this is the main difference between them and humans: either you have them trained on every situation, or they go into weird failure modes like this.
Yeah, the correlated risk with AVs is a pretty serious concern. And not just in emergencies, where they can easily DDoS the roads; even things like widespread weaknesses or edge cases in their perception models can cause really weird and disturbing outcomes.
Imagine a model that works really well for detecting cars and adults but routinely misses children; you could end up with cars that are 1/10th as deadly to adults but 2x as deadly to children. Yes, in this hypothetical it saves lives overall, but is it actually a societal good? In some ways yes; in some ways it should never be allowed on any roads at all. It's one of the reasons aggregated metrics on safety are so important to scrutinize.
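To see why the aggregate can mislead, here's a back-of-envelope with invented numbers (none of these figures come from real crash data):

```python
# Invented baseline: human drivers cause 1,000 adult and 100 child fatalities a year.
adult_base, child_base = 1000, 100

adult_av = adult_base / 10   # "1/10th as deadly to adults" -> 100
child_av = child_base * 2    # "2x as deadly to children"   -> 200

print(f"AV total: {adult_av + child_av} vs human total: {adult_base + child_base}")
# AV total: 300.0 vs human total: 1100 -- fewer deaths overall,
# yet twice as many children die. The headline number hides the shifted burden.
```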
We already have a solution; it's written down in the traffic laws. If the signals fail, treat the intersection roughly like a four-way stop. Everybody learns this in drivers' ed. It's not obscure. If the cars can't follow traffic rules, maybe they're not ready to be on the streets unsupervised.
The problem seems to be that the Waymo cars did exactly as you requested and treated the intersections like four-way stops, but kept getting displaced by more aggressive drivers who simply slowed and rolled.
How many non-Waymo accidents happened at intersections during this time? I suspect more than zero, given my experiences with other drivers when traffic lights go out. Apparently Waymo's numbers are zero, so humans are gonna lose this one.
The problem here is that safety and throughput are at odds. Waymo chose safety while most drivers chose throughput. Had Waymo been more aggressive and gotten into an accident because it wouldn't give way, we'd have headlines about that, too.
The biggest obstacle to self-driving is the fact that a lot of driving consists of knowing when to break the law.
> The problem here is that safety and throughput are at odds. Waymo chose safety while most drivers chose throughput.
Did they? They chose their safety. I suspect the net effect of their behavior made the safety of everyone worse.
They did such a bad job of handling it that people had to go around them, making things less safe.
We know what people are like. Not everyone is OK doing 2-3 mph for an extended time waiting for a Waymo to feel "safe".
Operating in a way that causes large numbers of other drivers to feel the need to bypass you is fundamentally worse.
> Did they? They chose their safety. I suspect the net effect of their behavior made the safety of everyone worse.
There is no viable choice other than prioritizing the safety of your rider. Anything less would be grounds for both lawsuits and reputational death.
The fact that everybody else chose throughput over safety is not the fault of Waymo.
Will you also complain when enough Waymo cars start running on the freeways that a couple of them in a row can effectively enforce following distances and speed limits, for example?
Obstructing traffic is also against the law.
Something I had pounded into me when I drove too slowly and cautiously during my first driving test, and failed.
Those Waymos weren't moving, which is a pretty egregious example of obstructing traffic.
An old rule of thumb is every time a service expands by an order of magnitude there are new problems to solve. I suspect and hope this is just Waymo getting to one of those points with new problems to solve, and they will find a way to more graciously handle this in the future.
> Will you also complain when enough Waymo cars start running on the freeways that a couple of them in a row can effectively enforce following distances and speed limits, for example?
In my state, that would itself be a traffic violation, so yes I would. The leftmost lane on an interstate highway is reserved for passing. An autonomous vehicle cruising in that lane (regardless of speed) would therefore be programmed in a way that deliberately violates this law.
Enforcement is its own challenge, whether robots or humans.
> The leftmost lane on an interstate highway is reserved for passing.
Sadly, in most states, this is not true anymore. Most of those laws have been repealed.
I was very pleasantly surprised when I was in Colorado that they had explicit signs saying that if you had 5 (I think) or more cars behind you, you were supposed to pull right and let them pass.
However, I wasn't really thinking about a Waymo cruising in the left lane, but simply four or five Waymos in the right lane going right at the speed limit with proper following distance. That's going to effectively lock the right lane to the speed limit, which then means that even a single other car would lock the left lane to the speed limit as well. Basically, even a couple of Waymos in the right lane would drop freeway speeds dramatically.
I was under the impression these laws have become much more common over the past decade or two, when they were a rarity beforehand. My home state (MN), for example, didn't have one for the first 15 years or so of my driving. Much to my chagrin, I learned how much better life can be by spending time in a state (KY) where it was strictly enforced by both social convention and law enforcement.
Surprisingly it seems to even be moderately enforced these days even in Minnesota, which I’d have bet money on never happening since it’s a state pastime to play passive aggressive traffic cop for many.
https://www.reddit.com/r/coolguides/comments/1fuw97s/a_cool_...
Perhaps not the most trusted source on the matter, but at a glance it seems more or less the vast majority of states have laws that effectively “ban” slow moving traffic in the left lane from impeding traffic. Enforcement I’m sure will be all over the map - likely down to even a county level within individual states.
While I do wish all states were “purple” or above in that map, the situation seems largely up to how state patrol and other agencies want to enforce it.
I’ve seen it enforced now with my own two eyes in KY, WI, MN, and IL.
That may be the rules for humans, particularly people who are always in a rush and won't stay still anyway. With a major intersection turned into a four-way stop, you have lots of humans making very complex decisions and taking on a lot of personal risk. If multiple self-driving cars make the choice at the wrong time, you could jam up an intersection and create a worse traffic issue, or kill a passenger.
It's all a careful risk calculation; those self-driving cars need to determine if it's safe to continue through an intersection without the traffic lights their computers spent millions of hours training on (likewise with humans). That's a tough choice for a highly regulated/insured company running thousands of cars.
If anything, their programming should only take such a risk to move out of the way for a fire truck/ambulance.
In a traffic-jam situation, getting into a collision that "kills a passenger" is usually very hard (a 10-20 mph collision between cars won't kill anyone, except maybe someone with another health condition that gets triggered). With the large cars on the road today, pedestrians are at greater risk, but they similarly have more time to react due to the slower speeds.
> If multiple self driving cars make the choice at the wrong time
Why would they do that? It's a hive, isn't it?
> Assuming there's hundreds or thousands of self-driving cars suddenly driving in environment without any traffic lights.
Self-driving cars should (1) know how to handle stops, and (2) know that the rules for a failed traffic light (or one flashing red) are those for an all-way stop.
In a traffic-jam situation, an all-way stop becomes a "blocked intersection" if every vehicle obeys it strictly: when cars are coming from every direction, nobody has right of way, and you can't apply the rule of yielding to the right.
Humans, luckily, never follow the rules to the letter, which is what made it reasonable to write the rules down like this: some will be more impatient/aggressive, others will establish eye contact and wave one another through, etc.
In a situation like this, where you've got "drivers" who can't collaborate and learn on the spot, the rule does not make sense.
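You can see the deadlock in a toy model (the approach names and yield rule here are simplified to the point of caricature):

```python
# Four cars arrive at an all-way stop simultaneously, one per approach.
# Strict rule: yield to the vehicle on your right. A car coming from the
# north (heading south) has the west approach on its right, and so on.
right_of = {"north": "west", "west": "south", "south": "east", "east": "north"}
occupied = set(right_of)  # all four approaches have a waiting car

def may_proceed(approach: str) -> bool:
    return right_of[approach] not in occupied  # go only if your right is clear

print(all(not may_proceed(a) for a in occupied))  # True: every car waits on
# another, a circular wait with the same shape as a classic concurrency deadlock
```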
> What was the better solution here then?
Just pulling over and getting out of the way really would help. There's no reason a human couldn't do the same safely. Not beta-testing your cars on public roads would really be ideal, especially without human drivers ready to take over.
Tbh I'm surprised Waymo didn't have remote monitors who could handle cars at intersections or safely pull them to the side.
The better solution? To not fetishize technology.
Uh, how about having their remote driver staff take over?
> but an explicit choice was made not to do so for safety.
You know this how?
That's what they usually do. The assumption here is that, due to the blackout or some other related issue, the human drivers were unavailable.
However, even if that's not true, if they have more cars than human drivers, there's gonna be a problem until they work through the queue. And the bigger that ratio, the longer it will take.
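A rough back-of-envelope on that queue, with all numbers invented:

```python
import math

stranded_cars = 300   # hypothetical fleet caught in the outage
operators = 30        # hypothetical remote staff available
minutes_each = 2      # hypothetical time to assist one car

# Each operator clears one car at a time, so the last car waits roughly:
worst_wait = math.ceil(stranded_cars / operators) * minutes_each
print(f"last car waits ~{worst_wait} minutes")  # ~20; grows linearly with the ratio
```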
I guess that in a blackout they should just have the cars park somewhere safely. Maybe it'd be best to never have more cars on the road than assisting/available human drivers. As soon as no human drivers are available to take over, for outage/staffing/whatever reason, all cars should just pull over and stop.
This only works if they have cell service and enough human drivers to handle all of their cars.