Yep, this is the most exciting demo for me yet. Holy cow this is unbelievably fast.

The most impressive demo since GPT-3, honestly.

Since we already have open-source models that are plenty good, like the new Kimi K2.5, all I need is the ability to run them at moderate speed.

Honestly, I am not bullish on capabilities that models do not yet have; it seems we have seen it all, and the only real advancement has been context size.

And honestly, I would claim this is the market sentiment as well. Anthropic showed Opus 4.6 first, but the big release was actually Sonnet, the model people would use routinely. Nobody gave a shit about Gemini 3.1 Pro, while 3.0 Flash was very successful...

Despite all the developments of the last 12 months, no new use cases have opened up for me. But given this insane speed, even with a limited model or context size, we would approach AI very differently.