The global cost of inference at both OpenAI and Anthropic almost certainly exceeds the cost of training. The reason is simple: inference cost grows with the number of requests, not with the size of the dataset. My simplified math: suppose training a GPT-like model costs

C_T = $10,000,000.

Each query costs

C_I = $0.002.

Break-even:

N > C_T / C_I = 10,000,000 / 0.002 = 5,000,000,000 inferences.

So after 5 billion queries, inference costs surpass the training cost.

OpenAI claims it has 100 million users; multiply that by queries per user, and I let you judge.
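The break-even arithmetic above can be sketched in a few lines. The training cost, per-query cost, and queries-per-user figure below are the post's illustrative assumptions (queries per user per day is purely hypothetical, since the post leaves it to the reader), not real OpenAI or Anthropic numbers.

```python
# Break-even sketch: after how many queries does cumulative inference
# cost exceed the one-time training cost?
TRAINING_COST = 10_000_000    # C_T, dollars (assumed)
COST_PER_QUERY = 0.002        # C_I, dollars per inference (assumed)

break_even = TRAINING_COST / COST_PER_QUERY
print(f"Break-even: {break_even:,.0f} inferences")
# Break-even: 5,000,000,000 inferences

# Hypothetical scale check: 100 million users making some number of
# queries per user per day (the rate here is purely illustrative).
users = 100_000_000
queries_per_user_per_day = 10
days_to_break_even = break_even / (users * queries_per_user_per_day)
print(f"At that rate, break-even is reached in {days_to_break_even:.0f} days")
# At that rate, break-even is reached in 5 days
```

At the assumed rate, inference spending passes the training bill within a week, which is the point of the post: the per-query cost dominates at scale.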