> I see zero reasons to oppose robots visiting any website I would build.
> preventing search engines from indexing incomplete versions, or from crawling paths that make no sense for them to visit.
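That use case is exactly what robots.txt covers. A minimal sketch — the `/drafts/` and `/search` paths here are hypothetical stand-ins for "incomplete versions" and "paths that make no sense to crawl":

```
# robots.txt — advisory only; well-behaved crawlers honor it, misbehaving ones won't
User-agent: *
Disallow: /drafts/   # work-in-progress pages not ready for indexing
Disallow: /search    # crawler-trap paths with no value to an index
```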
What will you do when the bots ignore your instructions, and send a million requests a day to these URLs from half a million different IP addresses?
Let my site go down, then restart my server a few hours later. I'm a dude with a blog; I'm not making uptime guarantees. I think you're overestimating both the harm and how often this happens.
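For anyone who would rather not let it fall over, about the least-effort mitigation is nginx's built-in request limiting. A minimal sketch — zone name, rate, burst, and the 429 status are arbitrary choices, and since it keys on client IP it won't stop half a million distinct addresses, it just caps what any single one can do:

```nginx
# Goes in the http {} block: shared zone keyed by client IP,
# 10 MB of state, at most 5 requests/second per address.
limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

server {
    location / {
        # Allow short bursts, reject the overflow immediately instead of queueing.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;
    }
}
```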
Misbehaving scrapers have been a problem for years, not just since AI. I've written posts on how to properly handle scraping, the legal grey area it puts you in, and how to be a responsible scraper yourself. If companies don't want to be responsible, the solution isn't to abandon the open web; it's better laws and real enforcement of those laws.
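The gist of "responsible," in code terms, is: identify yourself, honor robots.txt, and pace your requests. A minimal sketch (not from those posts — the user agent string, URLs, and delay are placeholders):

```python
import time
import urllib.robotparser
import urllib.request

USER_AGENT = "my-research-bot/1.0 (+https://example.com/bot-info)"  # identify yourself
BASE = "https://example.com"
DELAY_SECONDS = 5  # pace requests instead of hammering the server

# Honor robots.txt before fetching anything.
robots = urllib.robotparser.RobotFileParser(BASE + "/robots.txt")
robots.read()

def fetch(path: str) -> bytes | None:
    url = BASE + path
    if not robots.can_fetch(USER_AGENT, url):
        return None  # the site asked crawlers to stay out of this path
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

for path in ["/", "/posts/"]:
    body = fetch(path)
    time.sleep(DELAY_SECONDS)  # a request every few seconds, not a million a day
```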
Sue them / press charges. DDoS is a felony.