Is it not true that the instruction decoder is always active on x86, and is quite complex?
Such a decoder is vastly simpler on AArch64.
That is one obvious architectural drawback for power efficiency: a legacy instruction set with variable-length instructions, two floating-point instruction sets (x87 and SSE), 16-bit compatibility with segmented memory, and hundreds of otherwise-unused opcodes.
How much legacy must Apple implement? Non-kernel AArch32 and Thumb-2?
Edit: think about it... the R4000 was the first 64-bit MIPS in 1991. AMD64 was announced in 2000 (first silicon shipped in 2003).
AArch64 emerged in 2011, and in taking their time, the designers avoided the mistakes made by others.
There's no AArch32 or Thumb support (A32/T32) on M-series chips. AArch64 (technically A64) is the only supported instruction set. Fun fact: this makes it impossible to run Mario Kart 8 via virtualization on Macs without software translation, since the game ships as A32 code.
How much that does for efficiency I can't say, but I imagine it helps, especially given just how damn easy it is to decode.
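
To make the decode contrast concrete, here's a toy C sketch (my own illustration, not anyone's real decoder): every A64 instruction is exactly 4 bytes, so decoder lane i can fetch instruction i independently, while on x86 you can't know where instruction i+1 starts until you've sized instruction i. The x86 length function below handles only a handful of one-byte opcodes plus mov-imm32; a real one also has to parse prefixes, ModRM, SIB, displacements, and immediates.

    #include <inttypes.h>
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* A64: fixed 4-byte instructions -- lane i fetches independently. */
    static uint32_t a64_fetch(const uint8_t *code, size_t i) {
        uint32_t insn;
        memcpy(&insn, code + 4 * i, sizeof insn);  /* alignment-safe read */
        return insn;
    }

    /* x86: toy length decoder for a tiny opcode subset. A real one must
       also parse prefixes, ModRM, SIB, displacement, and immediates. */
    static size_t x86_toy_length(const uint8_t *p) {
        if (p[0] == 0x90 || p[0] == 0xC3) return 1;   /* nop, ret */
        if (p[0] >= 0x50 && p[0] <= 0x5F) return 1;   /* push/pop r32 */
        if (p[0] >= 0xB8 && p[0] <= 0xBF) return 5;   /* mov r32, imm32 */
        return 1;  /* placeholder for everything unhandled */
    }

    /* Finding instruction i means sizing every earlier instruction. */
    static const uint8_t *x86_fetch(const uint8_t *code, size_t i) {
        const uint8_t *p = code;
        while (i--) p += x86_toy_length(p);
        return p;
    }

    int main(void) {
        const uint8_t x86[] = { 0x90, 0xB8, 1, 0, 0, 0, 0xC3 };  /* nop; mov eax,1; ret */
        const uint8_t a64[] = { 0x1F, 0x20, 0x03, 0xD5,          /* nop */
                                0xC0, 0x03, 0x5F, 0xD6 };        /* ret */
        printf("x86 insn 2 found at offset %td (serial walk)\n",
               x86_fetch(x86, 2) - x86);
        printf("A64 insn 1 is 0x%08" PRIx32 " (always at 4*i)\n",
               a64_fetch(a64, 1));
        return 0;
    }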
It actually doesn't make much difference: https://chipsandcheese.com/i/138977378/decoder-differences-a...
I had not realized that Apple did not implement any of the 32-bit ARM environment, but that cuts the legs out from under this argument in the article:
"In Anandtech’s interview, Jim Keller noted that both x86 and ARM both added features over time as software demands evolved. Both got cleaned up a bit when they went 64-bit, but remain old instruction sets that have seen years of iteration."
I still say that x86 must run two FPUs all the time, and that has to cost some power (AMD must run three - it also has 3DNow!).
Intel really couldn't resist adding instructions and features with each new chip (MMX, PAE for 32-bit, and many more than I can list), which are now mostly baggage.
> I still say that x86 must run two FPUs all the time, and that has to cost some power (AMD must run three - it also has 3DNow!).
Legacy floating-point and SIMD instructions exposed by the ISA (and extensions to it) don't have any bearing on how the hardware works internally: modern cores decode x87, MMX, and SSE operations into the same internal micro-ops and run them on shared execution units, so there is no separate "FPU" being kept powered for each instruction set.
Additionally, AMD processors haven't supported 3DNow! in over a decade -- K10 was the last processor family to support it.
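
For anyone who wants to verify that on real hardware: CPUID reports 3DNow! in extended leaf 0x80000001 (EDX bit 31, with bit 30 for the extended variant). A minimal GCC/Clang check -- on anything newer than K10, both bits read as zero:

    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned eax, ebx, ecx, edx;
        /* Extended leaf 0x80000001: EDX[31] = 3DNow!, EDX[30] = extended 3DNow! */
        if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
            puts("extended CPUID leaf unavailable");
            return 1;
        }
        printf("3DNow!:          %s\n", (edx & (1u << 31)) ? "yes" : "no");
        printf("Extended 3DNow!: %s\n", (edx & (1u << 30)) ? "yes" : "no");
        return 0;
    }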
Oh wow, I need to dig way deeper into this but wonderful resource - thanks!