To me there is a fundamental difference. Even if PC hardware costs slightly more (right now because of the RAM situation; Apple, producing its chips in house, can of course get better deals), it's something that is worth investing more in.
Maybe you spend $1000 more for a PC of comparable performance; well, tomorrow you need more power, so you swap or add another GPU, add more RAM, add another SSD. A workstation you can keep upgrading for years, paying a small cost for each bump in performance.
An Apple machine is basically throwaway: no component inside can be upgraded. You need more RAM? Throw it away and buy a new one. You want a new GPU technology? You have to replace the whole thing. And if something inside breaks? You of course throw away the whole computer, since everything is soldered to the mainboard.
There is then the software issue: with Apple devices you are forced to use macOS, which kind of sucks, especially for server usage. True, nowadays you can install Linux on it, but the GPU isn't that well supported, so you lose all the benefits. You're stuck with an OS that sucks, while in the PC market you have plenty of OS choices: Windows, a million Linux distributions, etc. If I need a workstation to train LLMs, why do I care about an OS with a GUI? It's only a waste of resources; I just need a box that runs Linux that I can SSH into. Also, you lose the benefits of containers, Docker, etc.
Macs suck even on the hardware side from a server point of view: for example, it's not possible to rack-mount them, it's not possible to have redundant PSUs, they don't offer remote KVM capability, etc.
"Upgrades" havent been a thing for nearly a decade. By the time you want to upgrade a machine part (c. 5yr+ for modern machines), you'd want to upgrade every thing, and its cheap to do so.
It isnt 2005 any more where RAM/CPU/etc. progress benefits from upgrading every 6mo. It's closer to 6yr to really notice
> By the time you want to upgrade a machine part (c. 5yr+ for modern machines), you'd want to upgrade everything,
That's only the case for CPU/MB/RAM, because the interfaces are tightly coupled (you want to upgrade your CPU, but the new one uses an AM5 socket so you need to upgrade the motherboard, which only works with DDR5 so you need to upgrade your RAM). For other parts, a "Ship of Theseus" approach is often worth it: you don't need to replace your 2TB NVMe M.2 storage just because you wanted a faster CPU, you can keep the same GPU since it's all PCIe, and the SATA DVD drive you've carried over since the early 2000s still works the same.
Even this is understating it; if you buy at the right point in the cycle, you can Ship-of-Theseus for quite a while. An AM4 motherboard released in Feb 2017 with a Ryzen 1600X CPU, DDR4 memory, and a GTX 780 Ti would be an obsolete system by today's standards. Yet that AM4 motherboard can be upgraded to run a Ryzen 5800X3D CPU, the same (or faster) DDR4 memory, and an RTX 5070 Ti GPU, and be very competitive with mid-tier 2026 systems built entirely from new components. Throughout all this, the case, PSU, cooling solution, and storage could all be retained, and only replaced when individual components fail.
I expect many users would be happy with the above final state through 2030, when the AM6 socket releases. That would be 13 years of service for the original motherboard, memory, case, and ancillary components. This is an extreme case, since you have to time the initial purchase perfectly, but it is possible.
You can keep the CPU and RAM way longer than the GPU if you game...
Your point kind of disproves itself.
https://store.steampowered.com/hwsurvey/videocard/
That's news to me. I see Mac Minis with external drives plugged in constantly; I bet those people would appreciate user-serviceable storage. I doubt they bought an external drive because they wanted to throw away the whole computer.
Mac minis have user serviceable storage: https://store.m4-ssd.com/products/third-party-ssd-for-mac-mi...
An external drive is a workaround for Apple's pricing scheme, often purchased at the same time as the computer.
> you need more RAM? Throw it away and buy a new one.
Or sell it, which is much easier to do with Macs because they're known quantities and not "Acer Onyx X321 Q-series Ultra".
> There is then the software issue, with Apple devices you are forced to use macOS that kind of sucks, especially for a server usage
That's a fair point. Apple would get a ton of goodwill if they released enough documentation to let Asahi keep up with new hardware. I can't imagine it would harm their ecosystem; the people who would actually run Linux are either not using Macs at all, or users like me who treat them as Unix workstations and ignore their lock-in attempts.
I think most of that is really opinion and experience. No doubt they're not truly designed or built for racks, but folks have been making rack mounts for Mac Minis since they first came out.
On the upgrade path, I don't think upgrades are truly a thing these days. Aside from storage, for most components, by the time you get to your next cycle it's usually best/easiest to refresh the whole system, unless you underbought the first time around.
> Mac suck even hardware side form a server point of view, for example it's not possible to rack mount them, it's not possible to have redundant PSU, key don't offer remote KVM capability, etc.
https://atp.fm/683
As others have said, that's just not the reality of a modern work machine. If I need a new GPU or more RAM, I'm positive I need everything else upgraded, too.
> with Apple devices you are forced to use macOS that kind of sucks, especially for a server usage
you can just install linux?
Only really possible with the M1, if you're referring to Asahi.
> You have to stuck with an OS that sucks, while in the PC market you have plenty of OS choices, Windows, a million of Linux distributions
Windows is 10x more enshittified than OSX
> An Apple machine is basically throw away: no component inside can be upgraded, you need more RAM? Throw it away and buy a new one.
Tell that to all the people rocking 5-10 year old MacBooks that still run great.