Hmm, can someone educate me here? Why don't bit flips ever seem to impact the results of calculations in settings like big-data analytics on AWS?
Is it a difference between server hardware managed by knowledgeable people and random hardware thrown together by home PC builders?
In a Belgian election, a party received 4096 unaccounted-for votes, most likely due to a bit flip — 4096 is exactly 2^12, the change a single flipped bit in a binary vote counter would produce: https://en.wikipedia.org/wiki/Electronic_voting_in_Belgium#R....
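To make the arithmetic concrete, here's a toy sketch (the tally value is made up) showing that flipping one bit of a counter shifts it by an exact power of two:

```python
# Illustrative only: a single flipped bit in a binary counter
# changes the stored value by exactly +/- 2^k.
count = 2_000                 # hypothetical vote tally (bit 12 happens to be 0)
flipped = count ^ (1 << 12)   # simulate a flip of bit 12

print(flipped - count)        # +4096 here; it would be -4096 if the bit had been 1
```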
Servers and pro workstations normally have ECC RAM.
You can only detect what you measure. Are these big-data analytics jobs ever run multiple times so that differing results could even be noticed?
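For what it's worth, the detection-by-redundancy idea is trivial to sketch (this is the principle behind lockstep / redundant execution, not something typical analytics pipelines actually do; the helper name is mine):

```python
# Run the same pure computation N times and flag any disagreement --
# a transient fault in one run would show up as a mismatch.
def run_redundant(f, *args, runs=2):
    results = [f(*args) for _ in range(runs)]
    if any(r != results[0] for r in results):
        raise RuntimeError("results disagree; possible transient fault")
    return results[0]

print(run_redundant(sum, range(1_000_000)))
```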
Presumably professional hardware uses ECC memory, which automatically corrects single-bit errors (and detects, but cannot correct, double-bit ones).
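If anyone's curious how that correction actually works: here's a toy Hamming(7,4) code in Python. It's the same principle ECC DRAM uses (redundant parity bits whose check pattern points at the flipped bit), though real modules use wider SECDED codes over 64-bit words:

```python
# Toy Hamming(7,4): 4 data bits protected by 3 parity bits.
def encode(d):                          # d: list of 4 data bits
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def correct(c):                         # c: 7-bit codeword, at most 1 bit flipped
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]      # positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3          # syndrome = 1-based position of the error
    if pos:
        c[pos - 1] ^= 1                 # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]     # recover the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                            # simulate a cosmic-ray bit flip
print(correct(word))                    # prints [1, 0, 1, 1]: data recovered
```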