Once the world all runs on AI-generated code, how much memory and how many CPU cycles will be lost to unnecessary code? Is the next wave of HN top stories "How we ditched AI code and reduced our AWS bill by 10,000%"?

I don't know, but the current situation is already so absurd that AI probably won't make it much worse. It might even make it a little better. I am talking about AI-generated "classical" code, not the AIs themselves.

Today we are piling abstractions on top of abstractions, culminating in chat apps that take a gigabyte of RAM. A few extra getters and setters are nothing compared to that, maybe literally nothing, since trivial accessors tend to be optimized out by the compiler.
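To make the compiler point concrete, here is a minimal C++ sketch (the struct and names are made up for illustration). Mainstream compilers such as GCC and Clang typically inline trivial accessors at -O2, so the call to x() compiles to the same machine code as reading the field directly:

    #include <cstdio>

    // Trivial accessors: optimizing compilers typically inline these,
    // so the "abstraction" costs nothing at runtime.
    struct Point {
        int x_;
        int x() const { return x_; }   // getter
        void set_x(int v) { x_ = v; }  // setter
    };

    int main() {
        Point p;
        p.set_x(42);
        // At -O2 this compiles to the same code as printing p.x_ directly.
        std::printf("%d\n", p.x());
    }

You can check this on Compiler Explorer: the generated assembly for the accessor version and the direct-field version comes out identical.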

Where it may actually improve things: it may encourage people to code a solution (or rather, have the AI generate one) instead of pulling in a big library for one small function. Both are bad, but from an efficiency standpoint, the more specialized AI-generated code may have an edge, as in the sketch below.
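For illustration, a hypothetical example of that trade-off in C++ (the function and scenario are invented, not from the post): a dozen lines of purpose-built code covering exactly the one behavior needed, instead of linking a general-purpose string library for a single call site:

    #include <cstdio>
    #include <string>

    // Purpose-built helper: does only the padding this program needs,
    // with none of the generality (locales, Unicode, a formatting DSL)
    // that a full library would drag in.
    std::string left_pad(const std::string& s, std::size_t width, char fill = ' ') {
        if (s.size() >= width) return s;
        return std::string(width - s.size(), fill) + s;
    }

    int main() {
        // Pad to ten characters with zeros: prints "0000000042".
        std::printf("%s\n", left_pad("42", 10, '0').c_str());
    }

Whether written by hand or generated, the specialized version carries no unused code, which is the efficiency edge being claimed.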

Note that this argument is only about runtime performance and memory consumption, not about matters like code maintainability and security.