People (and, frustratingly, LLMs too) usually cite https://openai.com/api/pricing/, which doesn't give the complete picture.
https://developers.openai.com/api/docs/pricing is what I always reference, and it explicitly shows that that pricing ($2.50/M input, $15/M output) applies to prompts under 272K input tokens.
It is nice that we get 70-72k more tokens before the price goes up (also, what does it cost beyond 272K tokens?)
> Prompts with more than 272K input tokens are priced at 2x input and 1.5x output for the full session for standard, batch, and flex.
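Combined with the rates quoted above, that rule is easy to sanity-check in a few lines of Python. A minimal sketch, assuming the $2.50/M input and $15/M output standard rates and reading "for the full session" as the multiplier applying to every token in the request, not just the tokens past the threshold:

```python
# Assumed standard-tier rates from the pricing page ($ per token).
INPUT_RATE = 2.50 / 1_000_000
OUTPUT_RATE = 15.00 / 1_000_000
THRESHOLD = 272_000  # long-context cutoff, in input tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one request under the quoted tiering rule."""
    in_rate, out_rate = INPUT_RATE, OUTPUT_RATE
    if input_tokens > THRESHOLD:
        # Past 272K input tokens, the whole request is repriced:
        # 2x on input, 1.5x on output.
        in_rate *= 2.0
        out_rate *= 1.5
    return input_tokens * in_rate + output_tokens * out_rate

# 300K in / 10K out: 300_000 * $5/M + 10_000 * $22.50/M
print(f"${request_cost(300_000, 10_000):.3f}")
```

Note the discontinuity this implies: a 272,001-token prompt costs roughly double what a 272,000-token one does, so it can be worth trimming a prompt that lands just over the line.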
Thanks — it looks like the pricing page keeps getting updated.
Even right now, one page refers to prices for "context lengths under 270K" while another has pricing for "<272K context length".