How about we use all that AI and start doing some serious optimizations to existing software? Reduce memory requirements by half, or even more.

Plenty of people do.

AI is one of the few major general technological breakthroughs, comparable to the Internet and electricity. It's potentially applicable to everything, which is why right now everyone is trying to apply it to everything. Including developing new optimization algorithms, optimizing optimizing compilers, optimizing applications, optimizing systems, optimizing hardware, ...

Big AI vendors are at the forefront of it, because they're the ones who actually pay for the AI revolution, so any efficiency improvement saves them money.

> comparable to the Internet and electricity.

It will be when it actually exists.

> which is why right now everyone is trying to apply it to everything

And are any of them actually succeeding? Where are the new AI businesses? Where's the new wealth and money? Where's the one-person AI pioneer doing what used to take hundreds?

> because they're the ones who actually pay for the AI revolution

Their customers do. The customers are getting ripped off. They wanted the AI revolution; what they got instead was a crappy search engine and a copyright-whitewashing service.

Improving LLM memory efficiency will just allow LLMs to use more memory.

We are.

I'm writing a metric ton of Rust code with Claude Code.

LLMs are intrinsically designed for token production, which is typically inversely related to optimization and efficiency.