Chinese companies have to pay much higher prices for the same GPUs, so they would need to charge more to turn a profit, and it's hard to charge more unless you have a clearly better product. That makes building massive data centers to gain market share a riskier bet for them.
That said, Alibaba's decision not to release the weights for Qwen3-Max, combined with its announcement of $53 billion in AI infrastructure spending (https://www.reuters.com/world/china/alibaba-launches-qwen3-m...), suggests they think they're now at a point where it makes sense to scale up. (The Reuters article mentions data centers in several countries, which I assume also helps work around high GPU prices in China.)
Circling back to OpenAI: I don't think they're spending so much on infrastructure just because they want to train bigger models on more data, but more because they want to serve those bigger models to more customers who are using their services more intensively.