Way back in the day I had a software product with a basic system to prevent unauthorised sharing, since there was a small charge for it.
Every time I released an update, a new crack would appear. For the next six months I worked on improving the anti-copying code, until I stumbled across an article by a coder in the same boat as me.
He realised he was now playing a game with some other coders: he would make the copy protection better, and the cracker would then have fun cracking it. It was a game of whack-a-mole.
I removed the copy protection, as he did, and got back to my primary role of serving good software to my customers.
I feel like trying to prevent AI bots, or any bots, from crawling a public web service is a similar game of whack-a-mole, but one where you may also end up damaging your service.
> the cracker would then have fun cracking it.
I wonder if you could've won by making the cracking boring: no new techniques, just the bare minimum changes to require compiling a new crack, and just enough to make it difficult to automate. I.e., turn the cracking into a job.
But in reality, there are other community-driven motivations to put out cracks.
>No new techniques, bare minimum changes to require compiling a new crack, and just enough to make it difficult to automate.
From a practical perspective, you also have to have a steady stream of features for the newer versions to be worth cracking. Otherwise, why use v1.09 when v1.01 works fine? Moreover, putting less effort into improving the DRM is still playing the cat-and-mouse game, albeit with less time investment. If you're making minimal changes, the cracker also only has to spend minimal time updating the crack.
So many problems could be solved by letting go.
Unfortunately, social media and snowballing copyright maximalism have inflated egos to the point where more and more people think they need to control everything.
If only I could go back in time 26 years and let myself know I was right to focus on my customers.