Yes, you can set price caps. The cost of a query is knowable ahead of time under the default on-demand pricing model ($6 per TB of data processed by the query). People usually get caught out by running expensive queries repeatedly. BigQuery is very cost effective and can be used safely.
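The on-demand arithmetic is simple enough to sanity-check by hand. A minimal sketch, using the $6/TB rate quoted above; `query_cost_usd` is a made-up helper for illustration, not a Google API:

```python
def query_cost_usd(bytes_processed: int, usd_per_tb: float = 6.0) -> float:
    """On-demand cost at a flat rate per TB of data processed.

    Uses decimal TB (10**12 bytes) for simplicity and ignores
    BigQuery's own rounding rules (per-query minimums, etc.).
    """
    return bytes_processed / 10**12 * usd_per_tb

# A query that scans a full 1 TB costs $6 at this rate.
print(query_cost_usd(10**12))      # 6.0
# One that scans only 10 GB costs about six cents.
print(query_cost_usd(10 * 10**9))
```

In practice you get the bytes-processed figure before paying anything: BigQuery supports dry runs (`bq query --dry_run`, or `QueryJobConfig(dry_run=True)` in the Python client), which return the bytes a query would scan without executing it.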
You can tell someone has worked in the cloud for too long when they start to think of $6 per database query as a reasonable price.
We really need to go back to on-premises. We have surrendered our autonomy to these megacorps and are now paying for it, quite literally in many cases.
Surely most queries should process much less than 1 TB of data?
My 3TB, 41 billion row table costs pennies to query day to day. The billing is based on the data processed by the query, not the table size. I pay more for storage.
Can you actually set "price caps"?
Most cloud services only let you set billing alerts, which are notorious for showing up after you've accidentally spent $50k. So even if you had a system that automatically shut services down on receiving the alert, you'd be SOL.
Running ripgrep on my harddrive would cost me $48 at that price point.
BigQuery data is stored (I assume) in column oriented files with indices, so a typical query reads only a tiny fraction of the stored data.
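To put rough numbers on that (all invented for illustration, not actual BigQuery internals): a query selecting 2 columns of a hypothetical 50-column, 3 TB table scans roughly 2/50 of the stored bytes.

```python
# Hypothetical wide table, loosely modeled on the 3 TB example upthread.
table_bytes = 3 * 10**12        # total stored size: 3 TB
num_columns = 50                # made-up column count
bytes_per_column = table_bytes // num_columns

# Columnar storage: a query touching 2 columns reads ~2 columns' worth
# of bytes, not the whole table.
scanned = 2 * bytes_per_column
cost = scanned / 10**12 * 6.0   # at the $6/TB on-demand rate

print(f"scanned ~{scanned / 10**9:.0f} GB, costing ~${cost:.2f}")
```

Partition and clustering pruning can cut the scanned bytes further still, which is how day-to-day queries against a multi-TB table can end up costing pennies.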