Sensor fusion is an issue, but one that is solvable with time and investment in the driving model; sensor-can't-see-anything is a show stopper.

A self-driving solution that can be totally knocked out by a speck of mud, heavy rain, morning dew, or bright sunlight at dawn and dusk is a non-starter: you can't engineer your way out of sensor blindness.

I don't want a solution that is available 98% of the time; I want one that is always available and can't be blinded by bad lighting conditions.

I think he did it because his solution could always lean on the crutch of "FSD Not Available, Right hand Camera is Blocked" messaging and "Driver Supervision" as the backstop for any failure anywhere in the stack. Waymo had no choice but to solve the expensive problem of "Always Available and Safe" and work backwards on price.

> Waymo had no choice but to solve the expensive problem of "Always Available and Safe"

And it's still not clear whether they use a fallback driving stack when one of the non-essential (i.e. non-camera (1)) sensors is degraded; I haven't seen Waymo clearly state the capabilities of their self-driving stack in this regard. On the other hand, there are such things as washer fluid and high-dynamic-range cameras. (A toy sketch of what such a fallback might look like follows the footnote.)

(1) You can't drive in a city if you can't see the light emitted by traffic lights, and neither lidar nor radar can see it.
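
To make that concrete, here is a minimal sketch in Python of the mode selection a fallback driving stack implies, assuming (as in the footnote) that cameras are essential and lidar/radar are not. All names, types, and modes here are invented for illustration; this is not anything Waymo has published.

```python
# Hypothetical illustration only -- not Waymo's actual architecture.
# Idea being discussed: cameras are essential (traffic lights), so a
# degraded camera forces a safe stop, while a degraded lidar/radar
# could drop the vehicle into a conservative camera-only fallback
# instead of disabling driving entirely.

from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    FULL_STACK = auto()       # all sensors healthy: full fusion stack
    CAMERA_FALLBACK = auto()  # non-essential sensor degraded: camera-only
    SAFE_STOP = auto()        # essential sensor (camera) degraded: pull over


@dataclass
class SensorHealth:
    camera_ok: bool  # essential: only cameras can read traffic lights
    lidar_ok: bool   # treated as non-essential in this sketch
    radar_ok: bool   # treated as non-essential in this sketch


def select_mode(health: SensorHealth) -> Mode:
    if not health.camera_ok:
        # No way to read traffic lights without cameras; stop safely.
        return Mode.SAFE_STOP
    if not (health.lidar_ok and health.radar_ok):
        # Keep driving, but on a more conservative camera-only stack.
        return Mode.CAMERA_FALLBACK
    return Mode.FULL_STACK


if __name__ == "__main__":
    # A degraded lidar leaves the car driving, just more cautiously.
    print(select_mode(SensorHealth(camera_ok=True, lidar_ok=False, radar_ok=True)))
    # -> Mode.CAMERA_FALLBACK
```

Whether Waymo actually degrades gracefully like this, or simply stops, is exactly the capability statement I haven't seen from them.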

LIDAR is notoriously easy to blind; what are you on about? Bonus meme: LIDAR blinds you(r iPhone camera)!