It's roughly three times cheaper than GPT-5.2-codex, which in turn reflects the difference in energy cost between the US and China.
1. Electricity is at most ~25% of inference cost, so even if electricity were 3x cheaper in China, that would only reduce total cost by roughly 17% (a quick back-of-the-envelope check is sketched after this list).
2. Cost is only one input into price setting, and we have essentially no idea what the margins on inference are, so assuming current pricing is actually tied to costs is suspect.
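A rough sketch of the arithmetic in point 1, using the comment's assumed numbers (a ~25% electricity share of inference cost and a 3x electricity price gap; both are illustrative assumptions, not measured figures):

```python
# Back-of-the-envelope: how much does cheaper electricity move total inference cost?
# Assumed (illustrative) inputs from the comment above:
electricity_share = 0.25   # fraction of total inference cost that is electricity
price_ratio = 1 / 3        # Chinese electricity price relative to US

# Only the electricity slice of the cost shrinks; the rest is unchanged.
new_cost = (1 - electricity_share) + electricity_share * price_ratio
reduction = 1 - new_cost
print(f"Total cost reduction: {reduction:.1%}")  # -> 16.7%
```

Even under these generous assumptions, the savings top out well below the 3x price difference being explained.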
It also reflects the Nvidia tax.
Not really; Western AI companies can set their margins wherever they want.