What this post might be missing is that it’s not just Google that can block your website. A whole variety of actors can. Any service that hosts user-generated content is especially exposed, and not just HTML: a single image is enough. But really, any service is at risk. I’ve had to deal with many such cases: ISPs mistakenly blocking large IP prefixes, DPI software killing traffic, random antivirus software blocking your JS chunk because of a hash collision, even small single-town ISPs sinkholing your domain because of automated reports, and many more.
In the author’s case, he was at least able to reproduce the issues. Often, though, the problem is scoped to a small geographic region, and for a large internet service even a small town means thousands of people reaching out to support while the issue is invisible on the overall traffic graph.
The easiest steps you can take to be able to react to these issues: 1. Set up NEL logging [1] that reports to completely separate infrastructure, and 2. Use RIPE Atlas and similar services in the hope of reproducing the issue and grabbing a traceroute. Rough sketches of both follow.
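For the NEL part, the setup is just two response headers. A minimal sketch (the collector URL is a placeholder; the key point is that it should resolve and terminate on infrastructure that shares nothing with your main site, otherwise the reports die together with it):

    Report-To: {"group": "network-errors", "max_age": 86400, "endpoints": [{"url": "https://nel-collector.example.net/reports"}]}
    NEL: {"report_to": "network-errors", "max_age": 86400, "failure_fraction": 1.0}

Note that NEL is Chromium-only in practice, but Chrome’s share is usually enough to surface regional breakage. For RIPE Atlas, a one-off traceroute from probes in the affected region can be requested through its API, roughly like this (the API key, target, and country code are placeholders):

    curl -X POST https://atlas.ripe.net/api/v2/measurements/ \
      -H "Authorization: Key YOUR_ATLAS_KEY" \
      -H "Content-Type: application/json" \
      -d '{"definitions": [{"type": "traceroute", "af": 4, "target": "example.com", "description": "debug regional block"}],
           "probes": [{"type": "country", "value": "DE", "requested": 10}],
           "is_oneoff": true}'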
I’ve even attempted to create a hosted service for collecting NEL logs, but it seemed to be far too niche.
[1]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/Net...