3 reasons basically:
1. non-humans can create much more content than humans. There's a limit to how fast a human can write; a bot is basically unlimited. Without captchas, we'd all drown in a sea of Viagra spam, and the misinformation problem would get much worse.
2. Sometimes the website is actually powered by an expensive API, think flight searches for example. Airlines are really unhappy when you make too many searches / bookings that don't result in a purchase, as they don't want to leak their pricing structures to people who will exploit them adversarially. This sounds a bit unethical to some, but regulating this away would actually cause flight prices to go up across the board.
3. One-way searches. E.g. a government registry that lets you get the address, phone number and category of a company based on its registration number, but one that doesn't let you get the phone numbers of all bakeries in NYC for marketing purposes. If you make the registry accessible to bots, somebody will inevitably turn it into an SQL table that allows arbitrary queries.
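To make point 3 concrete, here's a toy sketch (all names and data are made up, not any real registry's API) of why a "lookup by key only" interface stops being one-way the moment a bot can enumerate keys: it just replays every lookup and inverts the data.

```python
# Hypothetical registry: intended interface is one record per known
# registration number, nothing else. Data is entirely fabricated.
REGISTRY = {
    "B-1001": {"name": "Acme Breads", "category": "bakery", "phone": "555-0101"},
    "B-1002": {"name": "Hot Crust", "category": "bakery", "phone": "555-0102"},
    "B-2001": {"name": "Fixit Corp", "category": "plumber", "phone": "555-0201"},
}

def lookup(reg_number: str):
    """The intended one-way interface: you need the exact number."""
    return REGISTRY.get(reg_number)

# What a bot does if it can call lookup() without limit: crawl every
# plausible registration number, then query the dump however it likes.
scraped = {rid: lookup(rid) for rid in REGISTRY}  # exhaustive crawl
bakery_phones = [r["phone"] for r in scraped.values()
                 if r["category"] == "bakery"]    # the reverse query
```

The captcha (or rate limit) is what keeps the cost of that exhaustive crawl high enough that the interface stays effectively one-way.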
i run a small wiki/image host and for me it's mainly:
4. they'll knock your server offline for everyone else by trying to scrape thousands of albums at once, while copying your users' uploads for their shitty discord bot, and they'll be begging for donations the entire time too