I wonder about this for things like self-driving cars. If a thousand people decide to drive the wrong way down a particular stretch of highway, or slam on the brakes every time they see a particular person's political sign, could that surreptitiously poison the training data and spread to other vehicles?

I'd say cats are already pretty much self-driven.

As someone who isn't in the USA or Canada, I worry more that cars developed there will learn to "turn on red".