If you have ~$25k to buy an H200, don't buy one. Renting is much cheaper, and you can keep renting newer models when your H200 becomes an outdated paperweight.
Even assuming you ran inference for a full working day, every day, you'd need to run your H200 for almost two years to break even against rental prices. Realistically you won't run inference full time, so you'll never realise the card's value before it's obsolete.
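For anyone curious about the arithmetic behind "almost two years", here's a rough sketch. The card price is from this thread; the rental rate and usage pattern are assumptions for illustration, not quotes from any provider:

```python
# Break-even sketch: buying an H200 vs. renting one by the hour.
CARD_PRICE_USD = 25_000        # ~price of an H200 (from the thread)
RENTAL_RATE_USD_PER_HR = 6.00  # assumed cloud H200 rate; varies a lot by provider
HOURS_PER_DAY = 8              # "full working day"
WORK_DAYS_PER_YEAR = 260       # weekdays only

hours_to_break_even = CARD_PRICE_USD / RENTAL_RATE_USD_PER_HR
years = hours_to_break_even / (HOURS_PER_DAY * WORK_DAYS_PER_YEAR)
print(f"Break-even after ~{hours_to_break_even:,.0f} rented hours (~{years:.1f} years)")
```

Plug in your own rental rate and duty cycle; at lower hourly rates or part-time usage the break-even point pushes out well past the card's useful life.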
The company I work for is in the defense industry and by contract can't send any code outside its own datacenter. So cloud-rented H200s are a no-go, and obviously commercial LLMs as well. Breaking even is not the goal here.
In that case I'd suggest buying cheaper desktop cards instead of an H200. Two or three 5090s will let you run decent models at very good speed.