That's a future paid for by the effort of creating current frameworks, and it's a stagnant future where every "sophisticated system" is just re-hashing the last human frameworks ever created.
Bingo. LLMs consume data; they cannot generate new information, only give back what already exists or mangle it.
It is inevitable that they will degrade the total sum of information.