Apple always phases out these kinds of technologies after a while, to keep the ecosystem tidy and give developers a final push to abandon legacy code.
In this iteration it might also allow some simplification of the silicon, since the M-series chips include some black magic to mimic x86 behavior (mostly the memory-ordering model, IIRC) that lets Rosetta run that fast. IOW, Rosetta 2 is not software-only magic this time.
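A minimal sketch of what that memory black magic buys, using C11 atomics (the message-passing pattern below is a standard litmus test, not Apple's code): on x86's TSO model a reader that sees flag == 1 is guaranteed to also see data == 42, while on weakly ordered ARM it isn't. So a translator either inserts barriers everywhere (slow) or flips a hardware TSO switch, which is reportedly what Apple Silicon provides.

    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdio.h>

    static atomic_int data, flag;

    static void *writer(void *arg) {
        (void)arg;
        atomic_store_explicit(&data, 42, memory_order_relaxed);
        atomic_store_explicit(&flag, 1, memory_order_relaxed);
        return NULL;
    }

    static void *reader(void *arg) {
        (void)arg;
        while (atomic_load_explicit(&flag, memory_order_relaxed) == 0)
            ; /* spin until the writer's flag becomes visible */
        /* x86 (TSO): always prints 42. ARM (weak ordering): may print 0. */
        printf("data = %d\n", atomic_load_explicit(&data, memory_order_relaxed));
        return NULL;
    }

    int main(void) {
        pthread_t w, r;
        pthread_create(&r, NULL, reader, NULL);
        pthread_create(&w, NULL, writer, NULL);
        pthread_join(w, NULL);
        pthread_join(r, NULL);
        return 0;
    }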
I remember using the first Rosetta to play StarCraft on my Intel Mac. It also got deprecated after a year or two.
So leaving things behind, despite some pain, is Apple's way of pushing people forward (e.g., optical media, ports, Rosetta 1, Adobe Flash).
If they hadn't deprecated 32-bit support, we would still be able to play Halo on the Mac.
This is the perfect comment because 1) it’s true, and 2) it can be read as supportive, a complaint, or just a neutral observation.
The problem is that keeping older architectures alive creates an exponentially growing workload, grinding everything to a halt.
So, even though I feel what you're saying, we can't have every nice thing we want at the same time.
What has been so impressive about the last 5 years of MacOS releases?
I'm still not sure what's so impressive about the last 25 years of Windows and macOS that means we need an absolute supercomputer by 2000 standards just to open a Word document the same way we did back in Windows 2000.
Didn't Word use to be installed from 2 floppy disks? Now Calculator.app leaks 40 GB of memory. Software in this sorry state can't be run on a supercomputer; it needs one of those theoretical ocean-boilers.
Word 4.0 for DOS from 1987, sure.
This is a false memory. The reason "splash screens" with little text banners updating you on the status of the program's initializers existed is that it took fucking forever to launch Word on a '90s PC.
The steep decline in software stability and usability has been quite impressive, I wasn’t expecting them to screw it up so fast. Alan Dye in particular is a true inspiration for those who subscribe to the Peter Principle.
https://en.wikipedia.org/wiki/Peter_principle
I'm not very well versed in macOS internals, but I was the tech lead of a Debian derivative. I also write HPC software and manage the relevant infrastructure from metal to user, so I believe I know some details about processor architectures, general hardware, Linux, and *NIX systems in general.
The user-visible layer of an operating system is generally one of the simpler layers to code and maintain, since it's built upon abstractions. However, the libraries powering those layers, especially the math-heavy and hardware-interacting ones, are much more complex due to the innate complexity of the hardware itself.
Keeping multiple copies of a library for two different architectures (even if they differ only in bit width), where that simple bit change requires different implementation strategies to work correctly, is a pain in itself (for more information, ask the Linux kernel devs, since they're also phasing out 32-bit x86).
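To make the bit-width pain concrete, here's a minimal sketch in C (the struct is hypothetical, not from any real library):

    #include <stdio.h>

    /* A hypothetical on-disk record, as a 32-bit-era library might define it. */
    struct record {
        long  offset;  /* 4 bytes on ILP32, 8 bytes on LP64 */
        void *payload; /* 4 bytes on ILP32, 8 bytes on LP64 */
        int   flags;   /* 4 bytes on both */
    };

    int main(void) {
        /* A 32-bit build prints 12; a 64-bit build typically prints 24
           (wider fields plus alignment padding). Any code that wrote this
           struct to disk verbatim now needs two incompatible
           implementations, both maintained forever. */
        printf("sizeof(struct record) = %zu\n", sizeof(struct record));
        return 0;
    }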
Moreover, x86 and x86_64 are completely different modes on the processor. The 32-bit-only mode is called "protected mode" and the 64-bit mode is called "long mode"; running 32-bit x86 code under x86_64 uses a sub-mode of long mode ("compatibility mode"), and that's already complex enough at the silicon level.
The same complexities apply to ARM and other processor architectures. Silicon doesn't care much about the ISA.
We have already seen the effort of increasing performance on superscalar, out-of-order processors open up a new, previously untapped family of side-channel/speculative-execution attacks (Spectre, Meltdown, and friends). So processors are complex, software is complex, and multiple architectures on the same hardware are exponentially complex. If you want to see how the sausage is made, you can also research how Windows handles the backwards-compatibility problem (hint: in ELI5 terms, by keeping complete copies of Windows under a single Windows installation).
So, the impressive thing was keeping these multi-arch installations running for quite some time. We need to be able to let things go and free up some software and hardware budget for new innovations and improvements.
Addendum: Funnily enough, games are one of the harder targets for multi-arch systems, since they are both math-heavy and somewhat closer to the hardware than most applications, and thus very sensitive to architecture changes. Scientific/computational software is another such family, and it interestingly includes databases and office software. Excel also had a nasty floating-point bug back in the day, and the 32-bit and 64-bit installations of Microsoft Office have had feature differences from the beginning.
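A small sketch of that architecture sensitivity for math-heavy code (behavior depends on compiler and flags, so treat it as illustrative rather than guaranteed):

    #include <stdio.h>

    int main(void) {
        /* Classic 32-bit vs 64-bit x86 difference: 32-bit builds
           traditionally go through the x87 FPU, which keeps intermediates
           in 80-bit registers, while x86_64 builds default to SSE2, which
           rounds every step to 64 bits. */
        double huge = 1e308;
        double r = huge * 10.0 / 10.0;
        /* x87 (typical 32-bit): the intermediate survives in 80 bits, r == 1e308.
           SSE2 (typical 64-bit): huge * 10.0 overflows to inf, so r == inf.
           Same source, two architectures, two answers. */
        printf("r = %g\n", r);
        return 0;
    }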
How much worse they make things.
ARM/Apple-Silicon support?
Apple's contempt for compatibility makes for a poor game platform. It's also a drain on developers, who carry a continual maintenance burden just to keep things running with each year's new edition of iOS.
Is there not an emulator at this point?
It's really hard to get normal people to deal with emulators, which you'd need in order to build a community. And the original Halo allocated memory in a weird way that often screws things up.
Halo is a terrible example because it’s a game where the Mac version was never a very good way to play in the first place. I would guess that 99% of Halo players would be surprised to know it had a Mac version.
We should have a path to run legacy software when it's practical, but Halo is just not a good example to make that case.
I’d also personally be more interested in firing up the master chief collection or seeing if the upcoming campaign remake will be any good.
Just as a little side note: while it really isn't regarded as a typical Mac game nowadays, Halo was originally planned as a Mac exclusive, before Microsoft bought Bungie. If memory serves me right, Jobs even presented it as an exclusive at some Macworld or Expo.
Not sure it's only about tidiness. Rosetta 1 was licensed from a third party and Apple didn't want to keep paying the license fees.
I don't know if this is the situation with Rosetta 2.
I read a comment somewhere, possibly here, by an ex-Apple engineer who claimed that they mathematically optimized the thing for the performance it exhibits.
So, considering its silicon parts, Rosetta 2 is more of an in-house Apple endeavor and technology.
On the other hand, 5-7 years is a very typical timespan for Apple, so I don't think licensing fees were that important in ending support for it.
The original Rosetta was based on technology from Transitive, which, as I recall, IBM bought. I don't know where Rosetta 2 fits in, and any licensing associated with the original Rosetta was a long time ago.
> It also got deprecated after a year or two.
It was five years, from 2006 to 2011. Rosetta 2 will have been there for seven years (currently at five).
https://en.wikipedia.org/wiki/Rosetta_(software)
To clarify, the complete sentence in my mind was "...after a year or two I got my Intel Mac." I got mine in Q3 2008, just before the unibody ones were introduced.
So I effectively got two years out of Rosetta 1, but I didn't mean to say that Apple supported it for only two years.
Sorry for the confusion.
Looks like I can't edit my comment anymore to clarify.