> The biggest cost is still energy consumption, water and maintenance.
Are you saying that the operating costs for inference exceed the costs of training?
The global cost of inference at both OpenAI and Anthropic certainly exceeds the training cost. The reason is simple: inference cost grows with the number of requests, not with the size of the dataset. My simplified math (worked out with AI's help): suppose training a GPT-like model costs
C_T = $10,000,000.
Each query costs C_I = $0.002.
Break-even: N > C_T / C_I = $10,000,000 / $0.002 = 5,000,000,000 inferences.
So after 5 billion queries, inference costs surpass the training cost.
OpenAI claims it has 100 million users × queries per user per day = I'll let you judge.
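
To make the arithmetic concrete, here is a minimal sketch in Python using the same illustrative figures (the $10M training cost, $0.002 per query, and the queries-per-user rates are assumptions for the example, not published numbers):

```python
# Break-even point where cumulative inference cost exceeds the one-off training cost.
# All figures are illustrative assumptions, not actual OpenAI/Anthropic costs.
training_cost = 10_000_000    # C_T: training cost in dollars
cost_per_query = 0.002        # C_I: marginal cost per query in dollars

break_even_queries = training_cost / cost_per_query
print(f"Break-even: {break_even_queries:,.0f} queries")  # 5,000,000,000

# Rough time-to-break-even assuming 100 million users (hypothetical per-user rates).
users = 100_000_000
for queries_per_user_per_day in (1, 5, 10):
    daily_queries = users * queries_per_user_per_day
    days = break_even_queries / daily_queries
    print(f"{queries_per_user_per_day} queries/user/day -> break-even in about {days:.0f} days")
```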
No. But training an LLM is certainly very very expensive and a gamble every time you do it. I think of it a bit like a pharmaceutical company doing vaccine research…