Has anyone else noticed that the pace has broken xAI and they've just been dropped behind? The frontier improvement release loop is now ant -> openai -> google

xAI just released Grok 4.20 beta yesterday or the day before?

Musk said Grok 5 is currently being trained and has 7 trillion params (Grok 4 had 3 trillion).

My understanding is that all recent gains come from post-training, and no one (publicly) knows how much scaling pre-training will still help at this point.

Happy to learn more about this if anyone has more information.

You get more benefit from spending compute on post-training than on pre-training.

But scaling pre-training is still worth it if you can afford it.