I have the feeling that it's the small players that cause problems.

Dumb bots that don't respect robots.txt or nofollow are the ones trying every combination of the filters available in your search options and requesting every page for each such combination.

The number of search pages can easily be exponential in the number of filters you offer.
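For a rough sense of the blow-up, here's a minimal back-of-the-envelope sketch in Python (the filter and page counts are made-up numbers, just to illustrate):

    # Assumes n independent on/off filters and a fixed number of
    # result pages per filter combination (both hypothetical).
    def crawlable_pages(n_filters: int, pages_per_combination: int) -> int:
        # A naive crawler that follows every filter link sees
        # 2**n_filters distinct query strings, each paginated.
        return (2 ** n_filters) * pages_per_combination

    print(crawlable_pages(10, 20))  # 20480 URLs from just 10 filters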

Bots that walk into these traps do so because they are dumb. But even a small, degenerate bot can send more requests than 1M MAUs.

At least that's my impression of the problem we're sometimes facing.

Signed agents seem like a horrific solution. In many cases, just serving the traffic is better.