I've been a Mac user on and off since the 80s and I think one of the biggest changes is how separate the Mac ecosystem once was.
It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe. These things coexisted in time with the Western stuff, but little to nothing in the supply chain was shared; these artifacts were essentially from a separate world.
That's how it felt as a Mac user in the 80s and 90s. In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer. And most software was written pretty much from the ground up for a single platform as well.
And I remember often thinking how much that sucked. My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
Now so much has been standardized - everything is USB or Wi-Fi or Bluetooth or HTML or REST. Chrom(ium|e) and Firefox render pages the same on Mac or Windows or Linux. Connect any keyboard or webcam or whatever via USB. Share files between platforms with no issues. Electron apps run anywhere.
These days it feels like Mac developers (even inside Apple) are no longer a continent away from other developers. Coding skills are more transferable now, so there's probably more turnover in the Apple development ranks. There's certainly more influence from web design and mobile design, rather than a small number of very opinionated people saying "this is how a Macintosh application should work".
And I guess that's ok. As a positive I don't have the cross-platform woes anymore. And perhaps the price to be paid is that the Mac platform is less cohesive and more cosmopolitan (in the sense that it draws influence, sometimes messily, from all over).
> It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe
They also had an extensive industrial espionage program. In particular, most of the integrated circuits made in the Soviet Union were not original designs. They were verbatim copies of Western op-amps, logic gates, and CPUs. They had pin- and instruction-compatible knock-offs of 8086, Z80, etc. Rest assured, that wasn't because they loved the instruction set and recreated it from scratch.
Soviet scientists were at the forefront of certain disciplines, but tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
> They were verbatim copies of Western op-amps, logic gates, and CPUs.
DEC etched a great Easter egg onto the die of the MicroVAX CPU because of this: "VAX - when you care enough to steal the very best".
https://micro.magnet.fsu.edu/creatures/pages/russians.html
> tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
This is a biased take. One can make a similar and likely more factual claim about the US, where nearly every innovation across many disciplines is dictated and targeted for use by the war industry.
And while there were many low-quality knockoff electronics, the pre-collapse USSR achieved remarkable feats in many disciplines where the US was falling behind.
https://en.wikipedia.org/wiki/Timeline_of_Russian_innovation...
> One can make a similar and likely more factual claim about the US, where nearly every innovation across many disciplines is dictated and targeted for use by the war industry.
That's a complete non-sequitur.
> where nearly every innovation across many disciplines is dictated and targeted for use by the war industry.
As opposed to the USSR, whose Wikipedia page for innovations proudly features, let's see:
Aerial Refueling
Military robot
Paratrooping
Flame tank
Self-propelled multiple rocket launcher
Thermonuclear fusion (bomb)
AK-47
ICBMs
Tsar Bomba
to name a very small selection
It's almost as if you have it completely backwards and it was the USSR who was centrally planning to innovate in the art of killing.
I don't know if you deliberately skipped the 90% of other inventions that had nothing to do with (mind you) defense from American imperialism, or if you're being dense on purpose. Probably both?
Anyway, just glancing at the respective page for US "innovations", one can easily tell which country had the most obsessive offensive war industry.
There was a StarTalk episode recently where they talked about how, when they divided up the German aerospace scientists after WWII, Russia ended up with the majority of the KISS scientists and we got the perfectionist, superior-engineering ones. I always figured that was just a US vs Russia ethos difference. And maybe that's why they picked who they did, but maybe I have it backward.
That seems completely unbelievable to me: of the thousands (tens of thousands?) of scientists captured and recruited by the Allies, they just happened to split along philosophical lines? And then they had some huge cultural impact? As opposed to just being shanghaied by whatever nation got to them first, then absorbed into the greater social and economic fabric of that nation.
I always assumed it was just which army captured them.
Yes, and many German scientists went to great lengths to surrender to Western forces. I think von Braun was one of them.
"Once the rockets are up, who cares where they come down?
That's not my department," says Wernher von Braun
Seems analogous to Apple and Microsoft in the 80s and 90s. Though I'm not sure which country Xerox would be. Maybe Germany, in terms of the technology lifted by the later powers, but it seems like a bit of a rude comparison!
https://en.wikipedia.org/wiki/Pirates_of_Silicon_Valley
I think Apple had a trajectory, and the best time was at the end of the Steve Jobs era. After he left, they plummeted.
I think they were in their own little world, and when they got past that with the Unix-based OS X and moved from PowerPC to Intel, they entered their best period.
The Intel-based Macs were very interoperable and could dual-boot Windows. They had PCIe and could work with PC graphics cards; they used USB, Bluetooth, and more. Macs interoperated and cooperated with the rest of the computing world. The OS worked well enough that other Unix programs could, with a little tweaking, be compiled and run on Macs. Engineers, tech types, and scientists would buy and use Mac laptops.
But around the time Steve Jobs passed away, they lost a lot of that. They grabbed control of the ecosystem and didn't interoperate anymore. The ARM chips are impressive, but Apple is not interoperating any more. They have PCIe slots in the Mac Pro, but they aren't good for much except maybe NVMe storage. Without strong leadership at the top, they are more of a faceless turn-the-crank iterator.
(not that I like what microsoft has morphed into either)
True, also before, during, and after the Intel transition the ecosystem of indie and boutique apps for Macs was great. Panic and The Omni Group, just to name two boutique development companies, were probably at their peak in terms of desktop software. Besides, Mac OS X Tiger, Leopard, and Snow Leopard were polished and the UI was usable and cohesive.
Right now, the quality and attention to detail have plummeted. There is also a lot of iOS-ification going on. I wish they focused less on adding random features, and more on correctness, efficiency, and user experience. The attention to detail of UI elements in e.g. Snow Leopard, with a touch of skeuomorphism and reminiscent of classic Mac OS, is long gone.
Man, I love OmniGraffle. I guess design tools have generally improved over the years, but a couple of decades ago colleagues thought I was some kind of wizard for being able to easily whip up nice finite state machine diagrams in OmniGraffle.
Those of us old enough to remember can see that Apple is back in a John Sculley phase; however, this time around there are no founders to rescue the company a second time.
Not that it needs rescuing, as it isn't bleeding money like in the A/UX, Copland, and Taligent/OpenDoc days; however, they risk becoming only the iDevices company.
Yeah, Microsoft apparently is also back to its former self.
> Mac platform is less cohesive and more cosmopolitan
Counter example: Blender
It used to have an extremely idiosyncratic UI. I will only say: right-click select.
A big part of the UI redesign was making it behave more like other 3d applications. And it succeeded in doing so in a way that older users actually liked and that made it more productive and coherent to use.
What I am saying is, those are different dimensions. You can have a more cohesive UI while adhering more to standards.
There are still a lot of weird sacred cows that Macs would do very well to slaughter, like the inverted mouse wheel thing or refusing to implement proper alt-tab behavior.
You can have both, follow established standards and norms and be more cohesive.
The problem is simply that the quality isn't what it used to be on the software side. Which is following industry trends but still.
See, it's things like saying "proper" alt-tab behaviour that mean we'll never solve it. While Windows invented alt-tab, the way macOS does it is the macOS way, so if it changed I would be far less productive.
I mean, right-click select is also objectively better than left-click select, and it was the Blender way, but we still solved it.
Didn't that stuff start around 1998 though? That's when they started pushing USB and networking by default.
And then OS X came along, with bash and Unix and all, and there was a lot of shared developer knowledge.
But they still managed to keep a very distinctive and excellent OS, for 20 years after that.
The quality has dropped only recently.
This situation persists: for instance, try to write to an external disk formatted with NTFS using the GUI tools alone. It's baffling why Apple doesn't simply obtain a license to gain this capability. It's a big, unnecessary inconvenience, primarily for their own users.
exFAT to the rescue
Unfortunately, if the Mac isn't distinct from Windows and desktop Linux in some way, then what's the point?
Yes, as a long-time Mac user who now uses PCs at home but still uses a work-issued MacBook Pro, I greatly appreciate how Macs since the late 1990s-early 2000s are compatible with the PC ecosystem when it comes to peripherals, networking, and file systems.
However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem. I wish Apple stayed the course with defending "The Macintosh Way"; I am not a fan of the Web and mobile influences that have crept into macOS, and I am also not a fan of the nagging that later versions of macOS have in the name of "security" and promoting Apple products.
What the Mac has going for it today is mind-blowing ARM chips that are very fast and energy efficient. My work-issued MacBook Pro has absolutely amazing battery life, whereas my personal Framework 13's battery life is abysmal by comparison.
What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
> What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
Their advantage against Microsoft is that the Mac UX may be degrading, but the Windows UX is degrading much more quickly. Sure modern Mac OS is worse to use than either Snow Leopard or Windows 7, but at least you don't get the "sorry, all your programs are closed and your battery's at 10% because we rebooted your computer in the middle of the night to install ads for Draft Kings in the start menu" experience of modern Windows.
Their advantage against Linux is that while there are Linux-friendly OEMs, you can't just walk into a store and buy a Linux computer. The vast majority of PCs ship with Windows, and most users will stick with what comes with the computer. It definitely is possible to buy a computer preloaded with Linux, but you have to already know you want Linux and be willing to special order it online instead of buying from a store.
That advantage only works in countries where the general population can afford to buy Mac hardware, and since Apple will never make hardware at lower prices, Windows will keep its 70% of desktop market share at global scale.
> since Apple will never make hardware at lower prices
Apple has a deal with Walmart to sell the M1 Macbook Air for $600, so that's their current low-cost option. For the future, data-miners have found evidence that Apple will be making a new low-cost Macbook with the A18 Pro (chip from the iPhone 16 Pro), set to launch in 2026. https://www.macrumors.com/2025/06/30/new-macbook-with-a18-ch...
That is more than a month's salary in many countries.
Don't compare prices in wealthy countries like the US with the rest of the world.
How many M1s do you think Apple is selling in African countries, as an example?
> However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem.
As someone who has never really enjoyed using Macs, I do agree with this. It's probably why I don't mind them as much these days - using macOS in 2025 just kind of feels like a more annoying version of a Linux DE with less intent behind it. The way Macs used to work did not jibe with me, but everything felt like it was built carefully to make sense to someone.
> As a positive I don't have the cross-platform woes anymore
It's certainly better than it was; that said, Apple really tries to isolate itself by intentionally nerfing/restricting macOS software to Apple APIs and not playing ball with standards.
> My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
My MacBook Pro has an integrated GPU that supposedly rivals desktop GPUs. However, I have to use a second computer to play games on... which really sucks when travelling.
Apple doesn't even have passthrough eGPU support in virtual machines (or otherwise), so I can't even run a Linux/Windows VM and attach a portable eGPU to game with.
The M5 was released with a GPU 25% faster than the M4's. Great, but that has no effect on reading HN or watching YouTube videos, and VSCode doesn't use the GPU, so... good for you, Apple, I'll stick to my M1 + second PC setup.
VSCode certainly does use the GPU, that's how you get 120fps scrolling.
You're right, but it's immaterial. You don't upgrade your GPU to run VSCode better
I actually did this because I use my TV as a monitor, and I had to upgrade the GPU to get better framerates over HDMI.
One thing that has changed though, and this is a big pet peeve of mine. Bluetooth fucking file sharing. You used to be able to send files using bluetooth between devices. I had some old ass Nokia from 2005 and I could send files to my Linux computer over bluetooth.
This standard function doesn't exist on iOS but has been replaced with AirDrop. It's a big fuck you from Apple to everyone who prefers open standards.
Ahhh, Bluetooth share ... I remember messing around with it in 2017 on some old Nokias and an Android phone. That was the last time it ever worked for me. It's been quietly supplanted or removed from my newer devices, and the pairing is quite finicky. Also, the transfer speeds (back then) were awful - Kb/s.
Now my go-to is Dropbox/cloud/Sharik for small files and rsync for bulk backups.
Sharik looks great! They need a canonical domain receivers can visit without entering IPs like https://pairdrop.net/ It even has a shortcut for the share menu!
https://github.com/schlagmichdoch/PairDrop/blob/master/docs/...
I was getting megabit speeds 20 years ago.
Maybe try LocalSend; it's a great piece of software replicating AirDrop, but cross-platform and open source.
https://localsend.org/
Until you try to send a file with an extension it's not allowed to send on iOS, like mkv. Apart from that, it's great.
> much less a hard drive
This isn't true - my shining moment as a 10-year-old kid (~1998) was when the HD in our Macintosh went out, we went down to CompUSA, and I picked a random IDE drive instead of the Mac-branded drives (because it was much cheaper), and it just worked after reinstalling Mac OS.
You were lucky to have a Mac with IDE drives in 1998. AFAIK that was only the G3s and some lower-end Performa. I had a 9600 and there was no avoiding SCSI (well, I say that but I did put an IDE card in it at some point).
The true revelation was the B&W G3s. Those machines came from another universe.
we did not have a G3 (we definitely could not afford that). IIRC we had a 6400 or something of that vintage.
Yeah, looks like Apple's switch from SCSI to IDE started in '94. But the first couple of Macs my family had (SE, Quadra 605) would not have accepted an IDE drive.
"In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer."
I mounted a 20MB external Apple hard drive:
https://retrorepairsandrefurbs.com/2023/01/25/1988-apple-20s...
... on my MS-DOS system, in 1994, by attaching it to my sound card.
The Pro Audio Spectrum 16, weirdly, had a SCSI connector on it.
The SCSI port was probably meant for a CD-ROM drive. I recall my Sound Blaster Pro had a proprietary Panasonic CD-ROM interface, and at the time, sound cards often came bundled with a CD-ROM drive (or is that vice versa?).
It's not okay, because the experience is not good, and it can be.