Not to mention that WindowServer seems to take 100+% cpu since the upgrade. Also I can't paste filenames in the save file dialog in some apps. And the URL field in Safari is just weird.

My computer was running so slowly that I had to minimize transparency in system preferences somewhere. I think I also turned off opening every app in its own space. And I hid the icons on the Desktop in Finder settings somehow, which helped a lot. There are countless other little tweaks that are worth investigating.

I also highly recommend App Tamer (no affiliation). It lets you jail background apps at 10% cpu or whatever. It won't help with WindowServer or kernel_task (which also often runs at 100+% cpu), but it's something.
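For the curious, the trick tools like App Tamer and cpulimit use is (roughly) duty-cycling a process with SIGSTOP/SIGCONT. A minimal Python sketch, assuming a POSIX system; the 10% duty value and the busy child are purely for illustration:

```python
import os
import resource
import signal
import subprocess
import sys
import time

def throttle(pid, duty=0.10, period=0.1, duration=1.0):
    """Duty-cycle a process: let it run `duty` of each period, pause it
    for the rest. Roughly what cpulimit-style tools do to cap CPU use."""
    end = time.monotonic() + duration
    while time.monotonic() < end:
        os.kill(pid, signal.SIGCONT)
        time.sleep(period * duty)          # running for ~10 ms
        os.kill(pid, signal.SIGSTOP)
        time.sleep(period * (1 - duty))    # paused for ~90 ms
    os.kill(pid, signal.SIGCONT)

# Spin up a 100%-CPU child and cap it at ~10% for one second.
child = subprocess.Popen([sys.executable, "-c", "while True: pass"])
throttle(child.pid)
child.terminate()
child.wait()

# Total CPU the child actually burned: ~0.1 s rather than ~1 s.
cpu_used = resource.getrusage(resource.RUSAGE_CHILDREN).ru_utime
```

Real tools are more careful (whole process trees, adaptive periods), but the core mechanism really is just pausing and resuming the target, which is why it can't touch kernel_task or WindowServer.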

I can't help but feel that there's nobody at the wheel at Apple anymore. When I have to wait multiple seconds to open a window, to switch between apps, to go to my Applications folder, then something is terribly wrong. Computers have been running thousands of times slower than they should be for decades, but now it's reaching the point where daily work is becoming difficult.

I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples. Then we can finally boot up with better alternatives that force Apple/Microsoft/Google to try again. I could see Finder or File Explorer alternatives replacing the native ones.

> Computers have been running thousands of times slower than they should be for decades

I've been hearing this complaint for decades and I'll never understand it. The suggestion seems completely at odds with my own experience. Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

I remember a time when I could visually see the screen repaint after minimizing a window, or waiting 3 minutes for the OS to boot, or waiting 30 minutes to install a 600mb video game from local media. My m2 air with 16gb of memory only has to reboot for updates, I haphazardly open 100 browser tabs, run spotify, slack, an IDE, build whatever project I'm working on, and the machine occasionally gets warm. Everything works fine, I never have performance issues. My linux machines, gaming pc, and phone feel just as snappy. It feels to me that we are living in a golden age of computer performance.

I think the best example is in iOS. On old iOS versions, the keyboard responsiveness took precedence over everything, no matter what. If you touched the keyboard, it would respond with an animation indicating what you are doing. The app itself may be frozen, but the self contained keyboard process would continue on, letting you know the app you are using is a buggy mess.

Now in iOS 26, you can just be typing in Notes or just the safari address bar for example, and the keyboard will randomly lag behind and freeze, likely because it is waiting on some autocomplete task to run on the keyboard process itself. And this is on top of the line, modern hardware.

A lot of the fundamentals that developers focused on in the past to ensure responsiveness to user input have been lost. And lost for no good reason, other than lazy development practices, unnecessary abstraction layers, and other modern developer conveniences.

Yeah, long ago when I was doing some iOS development, I remember Apple UX responsiveness mantras like “don’t block the main thread”, as it’s the thing responsible for keeping app UIs snappy even while something is happening.

Nowadays it seems like half of Apple’s own software blocks on its main thread; like you said, things like the keyboard lock up for no reason. God forbid you try to paste too much text into a Note - the paste will crawl to a halt. Or, on my M4 Max MacBook, 128GB RAM, 8TB SSD, Photos library with all originals saved locally - I press Cmd-R to rotate an image, and rotating a fully local image can sometimes take >10 seconds while showing a blocking “Rotating Image…” UI. It’s insane how low the bar has dropped for Apple software.
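The “don’t block the main thread” rule isn’t Apple-specific. Here is a minimal Python asyncio analogue (timings and function names are made up for illustration): a simulated UI loop stays responsive only because the slow work is pushed off to a worker thread instead of being run inline.

```python
import asyncio
import time

def blocking_work():
    """Stands in for a slow task, e.g. a huge paste or an image rotate."""
    time.sleep(0.3)

async def ui_loop(ticks):
    """Stands in for the main/UI thread: it must keep ticking to stay snappy."""
    for _ in range(6):
        ticks.append(time.monotonic())
        await asyncio.sleep(0.05)

async def main():
    ticks = []
    loop = asyncio.get_running_loop()
    # WRONG: calling blocking_work() directly here would freeze ui_loop
    # for 300 ms. RIGHT: hand it to a worker thread so the event loop
    # (our "main thread") keeps spinning.
    await asyncio.gather(
        ui_loop(ticks),
        loop.run_in_executor(None, blocking_work),
    )
    gaps = [b - a for a, b in zip(ticks, ticks[1:])]
    return max(gaps)

worst_gap = asyncio.run(main())
```

If you inline `blocking_work()` into `ui_loop` instead, the worst gap jumps from ~50 ms to 300+ ms, which is exactly the frozen-keyboard feeling being described.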

This trend was obvious when they started removing physical buttons. My thought was, man these people do put so much faith in software.

My M4 Max 128GB ... 90% of the time is like you say.

10% of the time, WindowServer takes off and spends 150% CPU. Or I develop keystroke lag. Or I can't get a terminal open because Time Machine has left the backup volume in a half-mounted state.

It's thousands of times faster than the Ultra 1 that was once on my desk. And I can certainly run workloads that fundamentally take thousands of times more cycles. But a greater proportion of this machine's speed goes to the UI, and its responsiveness doesn't always beat what I had 30 years ago.

Or contactsd lol

Spotlight doesn’t make sense either: caches get evicted, but there’s no logic preventing it from immediately rebuilding them.

Log processes are fine, but they should never be able to use 100% of the CPU, or run at the same priority (CPU + I/O) as interactive work.
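On Unix-like systems the standard knob for the CPU half of this is niceness. A minimal sketch, assuming a POSIX system, of a background logger/indexer deprioritizing itself:

```python
import os

# A background log/indexer process can deprioritize itself so it never
# competes with interactive work at the same CPU scheduling priority.
before = os.nice(0)   # read current niceness without changing it
after = os.nice(5)    # +5 niceness = lower CPU priority (max is 19)

# I/O priority is a separate knob: ionice(1)/ioprio_set(2) on Linux,
# setiopolicy_np(3) on macOS; neither is exposed in the Python stdlib.
```

The point of the comment stands, though: the OS vendor's own daemons have to opt in to this, and apparently often don't.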

Ok. Today we have multi-Ghz processors, with multiple cores at that.

Photons travel about 1 foot per nanosecond ... so the CPU can execute MANY instructions between the time photons leave your screen and the time they reach your eyes.

Now, on Windows start Word (on a Mac start Writer) ... come on ... I'll wait.

Still with me? Don't blame the SSD and reload it again from the cache.

Weep.
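Putting rough numbers on the photon claim above (assuming a 4 GHz core, ~4 instructions retired per cycle, 8 cores, and eyes about 2 feet from the screen; all illustrative figures):

```python
# Light travels ~1 foot per nanosecond, so screen -> eye at 2 feet ~ 2 ns.
photon_time_ns = 2

cycles_per_ns = 4        # 4 GHz clock
instrs_per_cycle = 4     # superscalar retirement, optimistic
cores = 8

# Instructions the whole chip can retire before the photons even arrive:
instructions_in_flight = photon_time_ns * cycles_per_ns * instrs_per_cycle * cores
# 2 * 4 * 4 * 8 = 256
```

A few hundred instructions per eye-blink-of-light, and yet a word processor still takes visible seconds to open.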

Not sure what you're getting at. MS Word, from launch to ready state after a macOS reboot, takes ~2 seconds on my M1 Mac. If I close and re-open it (so it's in the fs cache) it takes about ~1 second.

You and the sibling comment author just never experienced a truly responsive UI.

It is one where the reaction is under a single frame from the action. EDIT: and a frame is 1/60 s, that is 16.(6) ms. I feel bad that I have to mention this basic fact.

This was possible on 1980s hardware. I witnessed that, I used that. Why is it not possible now?
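The arithmetic in the EDIT above, spelled out (the 120 Hz figure is mine, for modern high-refresh displays):

```python
# The whole budget for an input -> photon response, in milliseconds:
frame_ms_60 = 1000 / 60     # one frame at 60 Hz: ~16.67 ms
frame_ms_120 = 1000 / 120   # one frame at 120 Hz: ~8.33 ms
```

Everything in the input path (event delivery, app work, compositing, display scanout) has to fit in that budget for the UI to feel instant.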

Base model M4 Mac Mini -- takes 2 seconds to load Word (and ready to type) without it being cached. Less than 1 second if I quit it completely, and launch again, which I assume is because it's cached in RAM.

> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

One analogy is that the distance between two places in the world hasn't changed, but we're not arriving significantly faster now than shortly after modern jetliners were invented. There was a period of new technology followed by rapid incremental progress toward shorter travel times, until it leveled off.

However, the number of people able to consistently travel between more places in the world has continued to increase. New airports open regularly, and airliners have been optimized to fit more people, at the cost of passenger comfort.

Similarly, computers, operating systems, and their software aren't aligned in optimizing for user experience. Until a certain point, user interactions on macOS took highest priority, which is why a single- or dual-core Mac felt more responsive than today's, despite the capabilities and total work capacity of new Macs being orders of magnitude higher.

So we're not really even asking for the equivalent of faster jet planes, here, just wistfully remembering when we didn't need to arrive hours early to wait in lines and have to undress to get through security. Eventually all of us who remember the old era will be gone, and the next people will yearn for something that has changed from the experiences they shared.

> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

This very much depends on what hardware you have and what you're doing on it (how much spare capacity you have).

Back in university I had a Techbite Zin 2, it had a Celeron N3350 and 4 GB of LPDDR4. It was affordable for me as a student (while I also had a PC in the dorm) and the keyboard was great and it worked out nicely for note taking and some web browsing when visiting parents in the countryside.

At the same time, the OS made a world of difference and it was anything but fast. Windows was pretty much unusable and it was the kind of hardware where you started to think whether you really need XFCE or whether LXDE would be enough.

I think both of the statements can be true: that Wirth's law is true and computers run way, way slower than they should due to bad software... and that normally you don't really feel it due to us throwing a lot of hardware at the problem to make us able to ignore it.

It's largely the same as you get with modern video game graphics and engines like UE5, where only now we are seeing horrible performance across the board that mainstream hardware often can't make up for and so devs reach for upscaling and framegen as something they demand you use (e.g. Borderlands 4), instead of just something to use for mobile gaming.

It's also like running ESLint and Prettier on your project and having a full build and formatting iteration take like 2 minutes without cache (though faster with cache), HOWEVER then you install Oxlint and Oxfmt and are surprised to find out that it takes SECONDS for the whole codebase. Maybe the "rewrite it in Rust" folks had a point. Bad code in Rust and similar languages will still run badly, but a fast runtime will make good code fly.

I could also probably compare the old Skype against modern Teams, or probably any split between the pre-Electron and modern day world.

Note: runtime in the loose sense, e.g. compiled native executables, vs the kind that also have GC, vs something like JVM and .NET, vs other interpreters like Python and Ruby and so on. Idk what you'd call it more precisely, execution model?

> Regardless of OS, they all seem extremely fast, and feel faster and faster as time goes on.

The modern throughput is faster by far. However, what some people mean when they talk about "slower" is the latency snappiness that characterizes early microcomputer systems. That has definitely gotten way worse in an empirically measurable fashion.

Dan Luu's article explains this very well [1].

It is difficult to go through that lived experience of low latency today, because you don't appreciate it until you've lived with it for years. Few people have access to an Apple ][ rig with a composite monitor for years on end any longer. The hackers who experienced that low latency never forgot it, because the responsiveness feels like a fluid extension of your thoughts in a way higher-latency systems cannot match.

[1] https://danluu.com/input-lag/

I wonder if this ties into why I'm baffled at the increasing trend of adding fake delays (f/ex "view transitions"). It's maddening to me. It's generally not a masking/performance delay either; I've recompiled a number of android apps for example to remove these sorts of things, and some actions that took an entire second to complete previously happen instantly after modification.

Have you tried disabling animations in the System Settings? Some apps respond to that.

Ohhhh trust me, I have, assuming you mean "Disable animations". The three duration scale developer settings too. Thank you for suggesting it, though, just in case.

Some apps do respect it, but sometimes it's hardcoded, and OS settings don't seem to override it. Even the OS doesn't respect it in some cases, but I think it used to. Flutter apps? Forget about it.

I think we're already seeing the operating systems that AI can build, and I don't think they've been an improvement.

Given that Microsoft keeps messing up my Windows 11 gaming desktop, I think you're right.

Gaming is one more area where Linux has surpassed Windows. I'm using Valve's Proton to run Windows games on Linux and it's smooth as butter.

Yeah if I wasn't a Valorant addict (and competing in Contender, pushing for Invite if we win this season) I'd drop it for Linux entirely.

Yes, except for the odd anti-cheat that needs kernel level access

> Not to mention that WindowServer seems to take 100+% cpu since the upgrade

That's because some app is spamming window updates.

It's been an ongoing problem for many releases. AFAICT, WindowServer 100% CPU is a symptom, not a cause.

But apps shouldn't be able to hammer WindowServer in the first place. If your app is misbehaving, your app should hang, not the OS window compositor!

FWIU there's really no backpressure mechanism for apps delegating compositing (via CoreAnimation / CALayers) to WindowServer which is the real problem IMO.

And I could imagine SwiftUI only makes this worse, because it's quite easy to trigger tons of unnecessary redraws.
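A standard fix for this kind of spam is coalescing: no matter how many redraw requests pile up while the compositor is busy, at most one stays pending. A toy Python sketch of the idea (all names hypothetical, not an actual WindowServer API):

```python
import queue

class CoalescingInbox:
    """Toy redraw mailbox: any number of requests collapse into one pending."""
    def __init__(self):
        self._q = queue.Queue(maxsize=1)

    def request_redraw(self):
        try:
            self._q.put_nowait(None)
        except queue.Full:
            pass  # a redraw is already pending; drop the duplicate

    def pending_redraws(self):
        """Drain the mailbox; returns how many redraws must actually run."""
        n = 0
        while True:
            try:
                self._q.get_nowait()
                n += 1
            except queue.Empty:
                return n

inbox = CoalescingInbox()
for _ in range(10_000):      # an app spamming window updates
    inbox.request_redraw()
work = inbox.pending_redraws()  # the compositor only repaints once
```

With this shape, a misbehaving app burns its own CPU generating requests, but the compositor's workload stays bounded to one repaint per frame.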

People don't really like apps that stutter.

And maybe that would get enough users to leave or complain that managers might allow some dev time to fix bad behavior.

Symptoms with no way to understand why.

If Apple gave insight into this, the developers would get bug reports and complaints.

Similar to the electron shit

QubesOS seems like a great migration target: it runs apps/OSes in secure sandboxes, and even with that overhead it doesn't seem worse than the terrible macOS 26 performance.

I think suggesting QubesOS to someone coming from macOS is a really bad choice.

> I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples.

Why? No one has shown that LLMs produce particularly good code. You can get a lot of useful shit done with what is still slop, but there is no reason to believe there's any evolutionary improvement.

kernel_task is often the OS thermal throttling. When was the last time you hit the vents with a can of compressed air?

Nobody's been at the wheel for a while, it's just not a race car, it's a barge.

Yeah, this is my actual issue with Tahoe. It blows my mind that people keep bringing up the corners when WindowServer is complete trash now.

kernel_task using 100% is the system thermally throttling and the OS spamming NOPs to cool the CPU down

This is usually not the case.

I don't know much about CPU internals, but this sounds like bullshit to me. A NOP is still an instruction that uses a cycle - why should that cool the CPU down? The CPU frequency should get reduced to lower the power consumption and hence the temperature.

Not all cycles cost the same amount of power. (Not that you would want to spam nops for thermal management, you should idle the core with a pause etc that actually tells the processor what you are trying to do.)

It used to be the case with Intel Macs and their atrocious confluence of cooling system, thermals, and power supply (the CPU itself was not really to blame).

But when RAPL and similar tools are used to throttle the CPU, the CPU time gets reported as kernel_task; on Linux it would show up similarly as one of the kernel threads.

Any way to see that?

> I'm cautiously optimistic that AI will let us build full operating systems using other OSs as working examples.

Even if that would be possible, you can't run commercial software. And for many people, the software they run is more important than the OS.

I'm very worried one day Apple will start enforcing upgrades to Tahoe just like Microsoft is doing with Windows.


Is this a case of "It Just Works" or "You're Holding It Wrong"?