> Actually it is not about this stage. It is about the sustainability of this when training data runs out
This is an argument from 2024. Somehow, the models have continued to improve.
Even if they stopped improving today, they are already good enough to generate profound change.
The wave front is already visible, we’re just on the shore waiting for the impact.
When training data runs out, their usefulness will diminish quickly. They will still be useful for searching documents etc., but I suspect they are not even good at that now.
When training data runs out, their usefulness will stop growing quickly. Why should their usefulness diminish?
Because they would not stay up to date with programming languages, tools, best practices, etc.
Maybe there is some way to keep the model up to date in less dramatic ways. But I think something has to give...
I mean, even now the vibe-coded stuff is reprehensible.