Because AI scraping is everywhere, flooding sites with useless traffic. It's not ideal, but it's the best people can do at the moment.

"It's not ideal" is an understatement, I have to do stupid captchas for about half my Google searches.

What kind of blog gets flooded by, what, 10 to 100 req/s at most? Somewhere along the line we seem to have forgotten how to deploy and run infrastructure on the internet, if some basic scrapers can take your website down.

I have a disconnect here too, which makes it feel like I'm missing something. I'm hit so often with very onerous captcha-like demands, or just blocked entirely, when trying to make a single page or login request. I understand restrictions for rate limiting and the like, but at this rate it feels like it's only a matter of time before the whole web sits behind voluntary ID-scan requests for "security", even if laws never come to pass.