Had this issue on a ray tracer I worked on. Since the sampling was supposed to be random, you could fire it up on multiple machines and just average the results to get a lower-noise image.
Except the distributed code fired up all the worker instances almost simultaneously, and the code used time() to seed the RNG, so many workers ended up with the same seed, and averaging their identical results did nothing.
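Roughly what the bug and the fix look like (a minimal sketch, not the actual renderer code; the WORKER_ID env var and the seed-mixing scheme are illustrative assumptions):

    import os
    import random
    import time

    # Buggy: every worker launched in the same second gets the same seed,
    # so all workers generate identical sample sequences and averaging
    # the renders does nothing to reduce the noise.
    rng = random.Random(int(time.time()))

    # One fix (WORKER_ID here stands in for a per-worker index handed
    # out by the job scheduler): mix something unique into the seed.
    worker_id = int(os.environ.get("WORKER_ID", "0"))
    rng = random.Random((int(time.time()) << 16) ^ worker_id)

    # Simpler and more robust: seed each worker from the OS entropy pool,
    # so the seeds are independent no matter when the workers start.
    rng = random.Random(os.urandom(16))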
I am reminded of an article about a poker site:
"There are 52-factorial ways to shuffle a deck of cards, but the site's PRNG only has 32 bits of state. 4 billion is alarmingly less than 52-factorial! But even worse, the PRNG is seeded using the number of milliseconds since midnight. 86 million is alarmingly less than 4 billion!"
So the actual entropy at the card table was equivalent to about 5 cards' worth. After seeing the 2 cards in his hand and the 3 cards in the flop, he could use a program to solve for every other card in everyone's hands and in the rest of the deck!
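The numbers check out as back-of-the-envelope math (my arithmetic, not the article's): the seed space is about 26.4 bits, while five known cards reveal about 28.2 bits, so the hole cards plus the flop over-determine the seed, and the seed determines the whole deck.

    import math

    bits_52_factorial = math.lgamma(53) / math.log(2)      # log2(52!)        ~= 225.6 bits
    bits_prng_state   = 32                                  # 32-bit PRNG state
    bits_ms_seed      = math.log2(24 * 60 * 60 * 1000)      # log2(86,400,000) ~=  26.4 bits
    bits_five_cards   = math.log2(52 * 51 * 50 * 49 * 48)   # 5 known cards    ~=  28.2 bits

    print(f"possible shuffles:  {bits_52_factorial:.1f} bits")
    print(f"PRNG state:         {bits_prng_state} bits")
    print(f"ms since midnight:  {bits_ms_seed:.1f} bits")
    print(f"5 known cards:      {bits_five_cards:.1f} bits")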
(I may have mixed up many details; if anyone has an archive of the article, please post it!)
I assume you're talking about
https://web.archive.org/web/20140210072712/http://www.laurad...
Previously on HN:
https://news.ycombinator.com/item?id=7207851