Does allowing bots to access my information prevent other people from accessing my information? No. If it did, you'd have a point and I would be against that. So many strange arguments are being made in this thread.

Ultimately it is the users of AI (and I am one of them) who benefit from that service. I put out a lot of open code and I hope that people are able to make use of it however they can. If that's through AI, go ahead.

> Does allowing bots to access my information prevent other people from accessing my information? No.

Yes it does, that's the entire point.

The flood of AI bots is so bad that servers (mainly older ones) are literally being overloaded, and newer ones see their hosting costs spike so high that it's unaffordable to keep the website alive.

I've had to pull websites offline because badly designed, ban-evading AI scraper bots would run up the bandwidth into the TENS OF TERABYTES, EACH, downloading the same JPEGs every 2-3 minutes in perpetuity. Evidently all that vibe coding isn't doing much good at Anthropic and Perplexity.

Even with my very cheap transfer, that racks up $50-$100/mo in additional costs. If I wanted to use any kind of fanciful "app" hosting it'd be thousands.
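For anyone wondering how that figure pencils out, here's a back-of-envelope check. The per-GB price is my assumption of what "very cheap transfer" means (roughly $0.005-$0.01/GB egress); actual provider pricing varies widely.

```python
# Rough egress-cost check for ~10 TB/mo of scraper traffic.
# ASSUMPTION: "cheap transfer" means about $0.005-$0.01 per GB; adjust for
# your provider. Not tied to any particular host's pricing.
def monthly_egress_cost(terabytes: float, price_per_gb: float) -> float:
    """Dollar cost of `terabytes` of egress at `price_per_gb`."""
    return terabytes * 1000 * price_per_gb  # 1 TB = 1000 GB (decimal)

low = monthly_egress_cost(10, 0.005)   # 10 TB at $0.005/GB
high = monthly_egress_cost(10, 0.01)   # 10 TB at $0.01/GB
print(f"${low:.0f}-${high:.0f}/mo")    # prints "$50-$100/mo"
```

At "app" platform egress rates (often 10-20x that per GB), the same traffic really would run into the thousands.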

I'm still very confused about who actually benefits from the bots; from the way they behave, it seems like they're wasting enormous amounts of resources on both ends for something that could have been done massively more efficiently.

That's a problem with scrapers, not with AI. I'm not sure why there are far more AI scraper bots now than there were search scraper bots back when search was the new thing. Still, that's an issue of scrapers and rate limiting, and it has nothing to do with wanting or not wanting AI to read your free and open content.
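On the rate-limiting point, the standard server-side tool is a per-client token bucket. Here's a minimal sketch; the class, limits, and keying scheme are illustrative, not from any particular framework:

```python
# Minimal per-client token-bucket rate limiter, the kind of thing a server
# can put in front of endpoints hammered by scrapers.
import time
from collections import defaultdict


class TokenBucket:
    def __init__(self, rate: float, burst: float):
        self.rate = rate            # tokens refilled per second
        self.burst = burst          # maximum bucket size
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# One bucket per client key (e.g. IP address): 1 request/sec, burst of 5.
buckets = defaultdict(lambda: TokenBucket(rate=1.0, burst=5.0))

def should_serve(client_key: str) -> bool:
    return buckets[client_key].allow()
```

The catch, and the reason it fails against the bots described upthread, is the key: a ban-evading scraper that rotates IPs and user agents gets a fresh bucket on every rotation, so per-client limits alone don't save your bandwidth bill.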

This whole discussion is about limiting bots and other unwanted agents, not about AI specifically (AI was just an obvious example).