Effectively they’ve turned any edge case into a potential city-wide problem and PR nightmare.

One driver doesn’t know how to handle a power outage? It’s not news. Hundreds of automated vehicles all experience the same failure? National news.

I live in the affected neighborhood. There were hundreds of human drivers who did not know how to handle a power outage... it was a minority of drivers, but still a nontrivial number of them. I even saw a Muni bus blow through a blacked-out intersection. The difference is that the Waymos failed in a way that prevented potential injury, whereas the humans who failed all failed in a way that could have caused injury.

I wish the Waymos had handled it better, yes, but I think the failure state they fell into is preferable to the alternative.

Locking down the roads creates a lot of potential injuries too.

And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive them for all that much misbehavior.

> is super easy to program

What?!? We’re talking about autonomous vehicles here.

I wouldn't say "super easy", but if an autonomous vehicle isn't programmed to handle:

  1: a traffic light that is completely dark
  2: a traffic light flashing red
  3: a traffic light flashing yellow
Then it is 100% not qualified to be on the road. Those are basic situations and incredibly easy to replicate, simulate, and incorporate into the training data.

That is to say, they are not edge cases.
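For illustration, the core decision table really is tiny. Here's a minimal sketch in Python, with every state name and behavior invented for the example rather than taken from any real stack:

  from enum import Enum, auto

  class SignalState(Enum):
      GREEN = auto()
      YELLOW = auto()
      RED = auto()
      FLASHING_RED = auto()
      FLASHING_YELLOW = auto()
      DARK = auto()  # power outage: no lights at all

  def approach_behavior(state: SignalState) -> str:
      """What to do on approach to a signalized intersection."""
      if state in (SignalState.GREEN, SignalState.YELLOW):
          return "proceed"
      if state == SignalState.FLASHING_YELLOW:
          return "slow down and proceed with caution"
      if state in (SignalState.RED, SignalState.FLASHING_RED, SignalState.DARK):
          # a dark signal is treated like a stop sign: all-way stop
          return "stop, then take turns as at an all-way stop"
      raise ValueError(f"unhandled signal state: {state}")

The hard part is upstream perception (is the signal actually dark, or just occluded or washed out by sun?), not this lookup.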

Dealing with other drivers in those settings is much harder, but that's a different problem, and you should be simulating your car under a wide variety of other-driver dynamics, from everyone being very nice to everyone being hyper-aggressive and the full spectrum in between.
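If you want a feel for what that sweep looks like, here's a toy version with a made-up aggressiveness knob and an invented conflict model; nothing here reflects a real simulator:

  import random

  def conflict_rate(aggressiveness: float, runs: int = 10_000, seed: int = 0) -> float:
      """Toy model: fraction of dead-light intersections where at least one
      of three surrounding drivers jumps their turn in the all-way-stop order."""
      rng = random.Random(seed)
      conflicts = sum(
          any(rng.random() < 0.3 * aggressiveness for _ in range(3))
          for _ in range(runs)
      )
      return conflicts / runs

  # sweep the full spectrum, from everyone very nice to everyone hyper-aggressive
  for a in (0.0, 0.25, 0.5, 0.75, 1.0):
      print(f"aggressiveness={a:.2f}  conflict rate={conflict_rate(a):.1%}")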

The cars already know those are intersections with lights. I'm not talking about that part. Just the basic logic that you don't go through at speed unless there is a green (or yellow) light.

A lights-out signal should be treated as all-way red, pedestrians included.

Identifying road signs, traffic lights, and dead traffic lights specifically is a narrow problem with feasible solutions, to the point where we can reasonably say "yeah, this sub-component basically works perfectly."

Compared to the overall self-driving problem, which is very much not a super easy problem.

Right. You know there are humans somewhere in the city who got confused or scared and messed up too. Maybe a young driver on a temporary permit who was barely confident in the first place, or just someone who didn't remember what to do and was already over-stressed.

Whatever, it happens.

This was a (totally unintentional) coordinated screw-up causing problems all over the city, as opposed to in one small spot.

The scale makes all the difference.

Definitely. The question then becomes: how do they respond to the behavior of other, more experienced drivers?

E.g. if they see 5 cars going around them and "solving" the intersection, do they get empowered to do the same? Or do some annoying honkers behind them make them bite the bullet and try their hand at crossing it (and not to worry, other drivers will also make sure no harm comes to anyone even if they make a small mistake)? Human drivers, no matter how inexperienced, will learn on the spot. Self-driving vehicles can only "learn" back in the SW department.

Yes, driving is a collaborative activity: it requires that we all partner on finding the most efficient traffic patterns when the lights fail. Self-driving cars cannot learn on the spot, and this is the main difference between them and humans: either they have been trained on every situation, or they fall into weird failure modes like this.

Yeah, the correlated risk with AVs is a pretty serious concern. And not just in emergencies, where they can easily DDoS the roads; even widespread weaknesses or edge cases in their perception models can cause really weird and disturbing outcomes.

Imagine a model that works really well for detecting cars and adults but routinely misses children; you could end up with cars that are 1/10th as deadly to adults but 2x as deadly to children. Yes, in this hypothetical it saves lives overall, but is it actually a societal good? In some ways yes; in some ways it should never be allowed on any roads at all. It's one of the reasons aggregated safety metrics are so important to scrutinize.
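To make the arithmetic concrete, with numbers invented purely for illustration:

  # hypothetical annual fatalities, invented for illustration only
  human_drivers = {"adults": 100, "children": 10}
  avs           = {"adults":  10, "children": 20}  # 1/10th for adults, 2x for children

  print("human total:", sum(human_drivers.values()))  # 110
  print("AV total:   ", sum(avs.values()))            # 30
  # The aggregate falls by ~73%, yet child fatalities double. A single
  # headline safety number completely hides the subgroup that got worse.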