I feel the same astonishment! Our computers today are surely faster, stronger, and smaller than yesterday's, but has that really translated into something tangible for the user? I feel that besides boot-up, which improved thanks to SSDs rather than gigahertz, nothing is any faster. It's as if all this extra power gets used to the maximum, for good and bad reasons, but is never focused on making 'it' faster. I'm a bit puzzled as to why my Mac can freeze for half a second when I hit cmd+A in a folder with 1000+ files.

Why doesn't Excel appear instantly, and why is it 2.29 GB now when Excel 98 for Mac was... 154.31 MB? Why is a LAN transfer between two computers still as slow as in 1999, around 10 MB/s, when both machines can simultaneously download at over 100 MB/s? And I won't even start on browser tabs hoarding gigabytes of memory; when you think about it, that's actually managed well as a whole, holding 700+ tabs without complaining.

And what about logs? This is a new branch of philosophy: open Console and witness an era where computational potential expands asymptotically while the user experience flatlines into absurdity.

It takes me longer to install a large Mac program from a .dmg than it takes to download it in the first place, even though my internet connection is fairly slow and my disk is an SSD. The only hypothesis that makes sense to me is that macOS is still riddled with O(n) or even O(n^2) algorithms that have never been improved, and this incompetence has been masked by ever-faster hardware.
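To make that hypothesis concrete, here's a toy sketch (plain Python, nothing to do with actual macOS internals) of how an accidentally-quadratic pattern stays invisible at small sizes and then bites at large ones: deduplicating n items via a list is O(n^2), via a set it's O(n).

```python
import time

def dedup_quadratic(items):
    seen = []                      # 'x not in seen' scans the list: O(n) per item
    for x in items:
        if x not in seen:          # total work grows as ~n^2
            seen.append(x)
    return seen

def dedup_linear(items):
    seen = set()                   # 'x not in seen' is O(1) per item
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

n = 10_000
data = list(range(n))

t0 = time.perf_counter(); dedup_quadratic(data); t_quad = time.perf_counter() - t0
t0 = time.perf_counter(); dedup_linear(data);    t_lin  = time.perf_counter() - t0
print(f"quadratic: {t_quad:.3f}s  linear: {t_lin:.4f}s")
```

Both versions feel instant with a hundred items, which is exactly why this kind of bug survives: fast hardware hides it until someone selects a folder with 1000+ files.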

A piece of evidence supporting this hypothesis: rsync (a program written by people who know their craft) does essentially the same job on macOS as Time Machine, yet the former is orders of magnitude faster than the latter.