You’re dangerously wrong. Winter tires make a very large difference in braking. If they’re so inconsequential, why are they mandatory in many northern places?

Culture shapes what people choose to do, and I've seen so many Americans with your point of view. It's maddening. Winter tires save lives!