Seems premature. My scanner software, ScanSnap, which is still regularly updated, requires Rosetta. ABBYY FineReader, the best Mac OCR, requires Rosetta. Although they may be related, as the ScanSnap software does OCR with the FineReader engine.

The M1 chip and Rosetta 2 were introduced in 2020. macOS 28 will be released in 2027. 7 years seems like plenty of time for software vendors to make the necessary updates. If Apple never discontinues Rosetta support, vendors will never update their software to run natively on Apple chips.

This is also consistent with Apple’s previous behavior with backwards compatibility, where Apple would provide a few years of support for the previous platform but will strongly nudge developers and users to move on. The Classic environment in Mac OS X that enabled classic Mac OS apps to run didn’t survive the Intel switch and was unavailable in Leopard even for PowerPC Macs, and the original Rosetta for PowerPC Mac OS X applications was not included starting with Lion, the release after Snow Leopard.

Honestly, for Apple this is above and beyond. They've killed support with less fanfare and less of a compatibility bridge than what we see here.

Bully on me for owning hardware and expecting it to behave consistently across OTA updates.

The hardware isn't (as far as I'm aware) changing. Please don't move the goalposts for hardware ownership (we should just be able to do with our hardware as we please) to also include indefinite support from vendors. That just makes us look like childish crybabies.

If you were instead asking for hardware documentation, or open-sourcing of Rosetta once sunset, then we're on the same team.

I never asked for an infinite window of software support, though. I merely want the features that I had when I bought the laptop, for as long as the OS supports my machine. The response is always "blame the third-parties" when apps break, but oftentimes the devs already made their money and moved on. The onus is on Apple to support their OS' software if they want to have my money.

Open-sourcing is one solution, but knowing Apple it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely; it's not sustainable outside the mobile dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should be able to retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on its merits, not because its competitor was forcibly mutilated.

> The onus is on Apple to support their OS' software if they want to have my money

Apple has done this exact same thing for every architecture change and every API they sunset, but you gave them your money anyway. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.

Sure, then don't update.

I think you probably should not buy Apple hardware. It is not a guarantee they have ever offered that their software would behave consistently across updates. If this mattered to me, I would have done some research and rapidly found out that Apple has done this every few years for the last 30 years.

But their new hardware is so good though, it's kind of hard to pass up

At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

I doubt such a thing has ever happened in the history of consumer-facing computing.

Have you ever heard of Windows? Unlike Apple, they do care about backwards compatibility, and don’t randomly go removing features users depend on.

Also they aren't the only ones, this is pretty much standard across commercial UNIXes (the survivors), embedded OSes, mainframes and micros.

and the consequences are dire

Are they? IMO Windows going downhill has more to do with what is being added to it than what it is preserving compatibility for.

> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

> I doubt such a thing has ever happened in the history of consumer-facing computing.

Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).

Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.

> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?

Linux users do it all the time with WINE/Proton. :-)

Before you complain about the term 'major OEM operating system': Ubuntu is shipped by major OEMs and listed in the supported requirements of many pieces of hardware and software.

> I doubt such a thing has ever happened in the history of consumer-facing computing.

Comments like this show how low standards have fallen. Mac OS X releases have short support windows. The hardware is locked down; you need a massive RE effort just to get Linux to work. The last few generations of x86 Mac hardware weren't locked down quite as much, but they were still locked down. M3 and M4 still do not have a working Linux installer. None of this is funded by Apple, either to get Linux working on it or to get Windows on ARM working on it, as far as I know.

In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. Windows 10 alone gave it 10 years of support from Microsoft; counting the Windows 7 years, that stretches to 13+ years of support.

> Linux users do it all the time with WINE/Proton. :-)

And there's a near-100% chance you'll have to recompile (or download recompiled binaries) if moving to a completely different architecture. Same here.

Not the same here. The user didn't have to get different binaries when they changed hardware, and that was a big selling point for the hardware. And now it's going to break in an arbitrary software update.

They did if they moved from x86 to ARM.

There's some kind of massive communication breakdown or misunderstanding going on.

In "Not the same here", "here" is people moving to M1. So no, they very much didn't; that was the whole point of Rosetta 2.

Not sure what you are saying. If you're saying you need the gamedev to recompile for ARM, you can run a virtualization layer, just like Mac and Windows. My friend has had the best results with: https://fex-emu.com/

> At what point in history have you owned a particular piece of hardware [...] and installed a major OEM operating system release a full 7 years after release without issue?

A few years ago, I installed Windows 10 on a cheap laptop from 2004—the laptop was running Windows XP, had 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.

To be fair, the computer ran horrendously slow, but nothing ever crashed on me, and I actually think it ran a little faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this wasn't just based on a brief impression.

Windows 95 was released... well, in 1995. In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)

> In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)

FWIW, Windows running on a 64-bit host no longer runs 16-bit binaries.

Yes. Still, there are ways to do it anyway, from DOSBox to WineVDM. Unlike macOS, where having even a 32-bit app (e.g. half of the Steam games that supported macOS to begin with) means you're fucked.

You can use dosbox and x86 virtual machines just fine in macOS (with the expected performance loss) right now, without Rosetta. macOS is still Turing complete.

Technically speaking, you can run anything on anything, since everything is Turing complete. Practically speaking, however...

E.g. I have half of the macOS games in my Steam library as 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed. The best way is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.

> my steam library as a 32-bit mac binaries. I don't know a way to launch them at any reasonable speed.

To be fair, it's the emulation of x86-32 with the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are the best portables, in terms of efficiency, that you can buy right now.

All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code with x86-64.

Rosetta 2 is pretty good at running x86-32. There are more registers on the destination architecture, after all.

>Windows running on a 64-bit host no longer runs 16-bit binaries.

Which isn't an issue since Windows 95 was not a 16-bit OS, that was MS-DOS. For 16-bit DOS apps there's virtualization things like DOSbox or even HW emulators.

[deleted]

This isn't a new or unique move; Apple has never prioritized backwards compatibility.

If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.

It's a bizarre assumption that this is about "neglected software."

A lot of software is for x64 only.

If Rosetta2 goes away, Parallels support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels are not neglected software.

This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.

https://docs.parallels.com/parallels-desktop-developers-guid...

Parallels' x86_64 emulation doesn't depend on Rosetta.

[deleted]

> If Rosetta2 goes away, Parallels support for x64 VMs likely goes away too.

Rosetta 2 never supported emulating a full VM, only individual applications.

You're right. It looks like the new full VM emulation in 20.2 doesn't use Rosetta.

https://www.parallels.com/blogs/parallels-desktop-20-2-0/

Nevertheless, running x64 software including Docker containers on aarch64 VMs does use Rosetta. There's still a significant valid use case that has nothing to do with neglected software.

Edited my post above. Thanks for the correction.

The OP only applies to Rosetta for running x64 Mac apps, not running x64 Linux software in aarch64 Linux VMs.

I seem to remember 68k software working (on PowerPC Macs) until Classic was killed off in Leopard? I'm likely misremembering the length of time, but it seems like that was the longest backwards-compatibility streak Apple had.

There's a lot of Win95 software that you can't run too. Microsoft puts a lot of work into their extensive backlog of working software. It's not just "good engineering"; it's honest-to-god fresh development.

Just because Microsoft does one thing doesn't mean Apple has to do the same.

That's not a good thing for other reasons; e.g. there are a lot of inconsistencies in modern Windows, like pieces of Windows 3.1 still in Windows 11.

There are leftovers from older versions of macOS and severely neglected apps in Tahoe too. Sure, they might have been given a new icon, or adopted the new system styling, but they have not been updated for ages.

That's not necessarily a good thing.

The main problem is not native software, but virtualization, since ARM64 hardware is still quite uncommon for Windows/Linux, and we need Rosetta for decent performance when running AMD64 in virtual machines.

There is lots of existing software (audio plugins, games, etc.) that will never see an update. All of that software will be lost. Most new software has ARM or universal binaries. If some vendors refuse to update their software, it's their problem. Windows still supports 32-bit applications, yet almost all new software is 64-bit.

I think this is exactly what they're issuing this notice to address. Rosetta performs so well that vendors are pretty okay just using it as long as possible, but a two year warning gives a clear signal that it's time to migrate.

If it's ok now then what's even the problem with letting it be?

One problem from Apple’s perspective is that it continues to cost them money to maintain both the translation layer and the x86_64 frameworks on an ongoing basis.

I mean, is it really an excessive burden to keep a "too popular" feature alive for users? Features users pay for cost money to build and maintain. These aren't unique situations.

It would be different if the feature wasn't popular at all but that doesn't seem to be the case.

It doesn't seem especially popular to me, so... citation needed? It's not being discontinued for being too popular, that's for sure.

Apple doesn't want to maintain it forever, and a handful of legacy apps will never be bothered to update to native Apple Silicon support unless it means losing access to their user base. Apple has given them plenty of time to do it naturally, and now Apple is giving them a stronger reason and a couple more years to get it done. Apple is not randomly discontinuing it with no notice; two years is plenty of time for maintained software to get over the finish line.

At the end of the day, Apple doesn't want to pay to maintain this compatibility layer for forever, and Apple's customers will have a better experience in the long run if the software they are using is not running through an extra translation layer.

There will always be some niche users who want this feature to remain forever, but it's clearly not a significant enough percentage of users for Apple to be worried about that, or else Apple would maintain it forever.

I usually agree with Apple but I don't agree with this. Rosetta 2 is basically magic; why would they take away one of their own strongest features? If they want big-name apps to compile for Apple Silicon, why can't they exert pressure through their code-signing process instead?

The “big name apps” have already moved to Apple Silicon. Rosetta helped them with that process a few years ago. We’re down to the long tail apps now. At some point, Rosetta is only helping a couple people and it won’t make sense to support it. I just looked, and right now on my M1 Air, I have exactly one x86 app running, and I was honestly surprised to find that one (Safari plug-in). Everything else is running ARM. My workload is office, general productivity, and Java software development. I’m sure that if you allow your Mac to report back app usage to Apple, they know if you’re using Rosetta or not, and if so, which apps require it. I suspect that’s why they’re telegraphing that they are about ready to pull the plug.

How do you check if you're running any x86 apps?

1. From the Apple menu, click "About This Mac."

2. In the resulting window, click the "More Info..." button. This will open the System Settings window.

3. Scroll to the bottom of that window and click "System Report."

4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all apps that are x86 will be listed with the kind, "Intel."

You can replace steps 1–3 with “Open /System/Applications/Utilities/System Information.app”.

Does the MacOS shell not split at spaces, or how does that work without quotation?

That’s not a shell command (otherwise “open” would be lowercase), it’s just an instruction.

Can you explain where you type that for non macOS users?

No it's an instruction for humans

Funny, that didn't occur to me at all. :-/ Maybe that's because I'm used to the FHS and I expected a path starting with /System to be either a path interpreted by some command or a description of a UI flow, not a path in the file system. So the thing you would type in the shell is just '/System/Applications/Utilities/System Information.app'? Does the Finder support starting programs by typing the path in the path bar, like MS Explorer on Windows?

Adobe Acrobat, Steam, and PDF Reader Pro...

To see what’s running,

1. Go into Activity Monitor

2. From the CPU or memory tab, look at the “Kind” column. It’ll either say “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
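If you'd rather check programmatically, macOS exposes a sysctl key, `sysctl.proc_translated`, that reports whether the calling process is running under Rosetta 2 (1 for translated, 0 for native). A minimal sketch; the `is_rosetta_translated` helper name is my own, and on non-macOS systems or older macOS without the key it simply reports False:

```python
import subprocess

def is_rosetta_translated() -> bool:
    """True if this process runs under Rosetta 2 (x86_64 translated on ARM).

    Queries the macOS sysctl key `sysctl.proc_translated`; falls back to
    False on non-macOS systems or when the key doesn't exist.
    """
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, timeout=5,
        )
        return out.stdout.strip() == "1"
    except (OSError, subprocess.TimeoutExpired):
        return False

if __name__ == "__main__":
    print("translated" if is_rosetta_translated() else "native (or non-macOS)")
```

Running the same check from a shell is just `sysctl -n sysctl.proc_translated` inside the process you care about.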

In macOS 26, you can see every Rosetta app that has recently run on your machine by going to System Information and then Software / Rosetta Software. It includes the "Fallback Reason" (e.g. if you manually forced the app under Rosetta or if it was an Intel-only binary).

FWIW, I have zero Rosetta apps on my M1 laptop and I've been a Mac user since the earliest days.

I'm super aware of the issues involved: I oversaw the transition from PPC to Intel at a university back in the day, using OG Rosetta. Even then, we had users who would only stop using their PPC apps when you took them from their cold, dead hands.

There's this Silicon app that scans your disk for them: https://github.com/DigiDNA/Silicon.
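Under the hood, a scanner like that only needs to ask each app bundle's main executable which architectures it contains. A rough sketch of the idea using `lipo -archs`; the `classify` and `app_kind` helper names are hypothetical, and the `lipo` call itself only works on macOS:

```python
import plistlib
import subprocess
from pathlib import Path

def classify(archs: set) -> str:
    """Map a Mach-O architecture set to Activity Monitor's 'Kind' labels."""
    has_intel = bool(archs & {"x86_64", "i386"})
    has_arm = bool(archs & {"arm64", "arm64e"})
    if has_intel and has_arm:
        return "Universal"
    if has_intel:
        return "Intel"   # needs Rosetta 2 on Apple Silicon
    if has_arm:
        return "Apple"
    return "Unknown"

def app_kind(app: Path) -> str:
    """Find a .app bundle's main executable and ask lipo for its archs."""
    with open(app / "Contents" / "Info.plist", "rb") as f:
        exe = plistlib.load(f)["CFBundleExecutable"]
    out = subprocess.run(
        ["lipo", "-archs", str(app / "Contents" / "MacOS" / exe)],
        capture_output=True, text=True, check=True,
    )
    return classify(set(out.stdout.split()))

if __name__ == "__main__":
    for app in sorted(Path("/Applications").glob("*.app")):
        try:
            print(f"{app_kind(app):9s}  {app.name}")
        except Exception:
            pass  # skip bundles without a readable main executable
```

Apps reported as "Intel" are the ones that will stop working once Rosetta is gone.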

How much die area does it use that could be used for performance? How much engineering time does it use? Does it make sense to keep it around, causing ~30% more power usage/less performance?

There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.

My naive assumption is that, by the time 2027 comes around, they might have some sort of slow software emulation that is on par with, say, M1 Rosetta performance.

Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.

Hardware acceleration [1]:

> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.

[1] https://en.wikipedia.org/wiki/Rosetta_(software)

While true, we're not talking about the chips losing TSO; Apple plans to keep Rosetta 2 for games and it has to remain fast because, well, it's video games. It also seems like they plan to keep their container tool[1]. This means they can't get rid of TSO at the silicon level and I have not heard this discussed as a possibility. We're only discussing the loss of the software support here. The answer to "How much die area does it use that could be used for performance?" is zero; they have chosen to do a partial phase-out that doesn't permit them to save the die space. They'd need to kill all remaining Rosetta 2 usage in order to cull the die space, and they seem to be going out of their way not to do this.

[1] https://github.com/apple/container -- uses Rosetta translation for x64 images.

> We're only discussing the loss of the software support here

Schematically "Rosetta 2" is multiple things:

- hardware support (e.g TSO)

- binary translation (AOT + JIT)

- fat binaries (dylibs, frameworks, executables)

- UI (inspector checkbox, arch(1) command, ...)

My bet is that beyond the fancy high-level "Rosetta 2" word what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries+frameworks[0], while the remainder remains.

[0]: or rather, heavily cull

So, the way to "use die area for performance" is to add more cache and branch predictor space. Because of this, anything that costs a lot of code size does consume it because it's using the cache up.

> Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.

There is hardware acceleration in place that exists only to, as you just stated, give it acceptable performance.

It does take up die space, but they're going to keep it around, because they've decided only to reduce the types of applications that Rosetta 2 (and the hardware that exists solely for it) will support.

So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.

Sure, this seems to be a restatement of my post, which started with "While true...", rather than a disagreement. I was pointing out which one of the "many acceptable opposing answers" Apple had chosen. They can't use that die area for performance because they're still using it even after this phase-out. (I'm not the person who wrote the original post.)

You can most likely use VueScan. I use that with an old ScanSnap i500 (or something).

[1] https://www.hamrick.com

I have VueScan and it's not even close.

Love VueScan for my film scanner!

I spent what I would consider to be a lot of money for a unitasker Fujitsu scanner device and am just astounded by how unmaintained and primitive the software is. I only use it on a Windows machine though, so I'm not in the same boat.

They were pretty quick to sunset the PPC version of Rosetta as well. It forces developers to prioritize making the change, or to make it clear that their software isn't supported.

The one I have my eye on is Minecraft. While not mission critical in any way, they were fairly quick to update the game itself, but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It's almost like the devs installed Rosetta 2 and don't realize their launcher is using it.

Rosetta for PPC apps was supported from the first Intel Macs released in January 2006 until 10.7 Lion was released in July 2011.

So just over five years? If Apple phase out Rosetta 2 in macOS 28, then it will have been supported for seven years.

Owning a Mac has always meant not relying on 3P software. Forget printer/scanner drivers. Even if they target macOS perfectly, there will come a day when you need to borrow a Windows PC or old Mac to print.

It happens to be ok for me as a SWE with basic home uses, so their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but at least provide some kind of emulation even if it's slow.

QEMU will still be an option. Albeit not the fastest or easiest option compared to Rosetta 2.

Me, too. It would be horrible to lose access to my scanner. I have no faith that Fujitsu would support my iX500.

This is Apple's "get your shit together and port to ARM64, you have 2 years" warning.

If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.