If you grep GB+ sized files, you'll love it.
At my previous job I often ended up searching raw log files sized around 5-8GB. Regular grep would grind away for 50-60 seconds, while ripgrep returned the same results in 250ms.
I was shocked the first time I used it and have sung its praises since.
If you can afford it (these products tend to be expensive if they're running it for you, resource-hungry if you're running it yourself, and sometimes both) and you've got logs in files, look at having the logs live in dedicated software.
I have used Graylog and Splunk (at different places); New Relic is also popular.
Not that ripgrep isn't awesome, but dedicated software gets you indexing and a UI that non-technical (or at least semi-technical) people can use to review logs, whereas "just use ripgrep" really isn't accessible for most users.
Yeah we had ELK for the common logs. This file was a special case that we didn't want indexed and widely accessible because it contained sensitive data. I only occasionally needed to do some digging, so I was fine resorting to ripgrep.
> If you grep GB+ sized files, you'll love it.
TBH I run into the "argument list too long" problem more often than the "one 5GB file" problem.
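For context, that error comes from the shell expanding a glob into more arguments than the kernel's ARG_MAX limit allows, not from grep itself. A minimal sketch of the usual workaround (the `logs/` directory and "ERROR" pattern are just illustrative):

```shell
# The glob expands before grep ever runs, so with enough files the
# command line blows past the kernel's ARG_MAX limit:
#   grep -l "ERROR" logs/*.log      # -> "Argument list too long"

# find + xargs streams the file names and batches them into several
# grep invocations, each safely under the limit:
find logs -name '*.log' -print0 | xargs -0 grep -l "ERROR"

# ripgrep sidesteps the problem entirely by doing its own recursive
# directory walk, so no giant argument list is ever built:
#   rg -l "ERROR" logs/
```

The `-print0`/`-0` pair keeps file names with spaces or newlines intact.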