Moore's law has arguably ended already, and may have done so a few years ago. Even if you can make a faster chip, there's a basic thermodynamics problem with running it at full tilt for any meaningful period of time. I would have expected that to impact software development, and I don't think it particularly has; there's also been no obvious gain in e.g. compilers or other optimization that would have countered the effect.
Probably architecture changes (x86 carries a lot of historical baggage that makes newer designs harder), plus more specialized hardware on the CPU itself. That might also be one of the reasons Apple went this way with its M-series silicon.