I have had the system for eight years and at no point would upgrading RAM have increased performance.
Upgrading the RAM would have created more waste than properly sizing the RAM-to-CPU proportion from the beginning.
It is very odd to encounter someone who has such a narrow view of computing that they cannot imagine someone not upgrading their RAM.
I have not once, literally not once, had RAM break either. I have helped manage clusters of hundreds of compute nodes, each of which would occasionally have its failures, but not once was RAM the cause. I'm fairly shocked to hear that anybody's RAM has failed, honestly, unless it's been overclocked or something.
I'm with you on this one. I've had.. 6? PCs. Basically every time I thought one was falling behind performance-wise, I realized that they had generally stopped selling RAM for it, and even if I had only wanted to upgrade the RAM, it wouldn't have been enough anymore. The CPU was also falling behind, and a new one needed a new socket and motherboard.
> It is very odd to encounter someone who has such a narrow view of computing that they cannot imagine someone not upgrading their RAM.
That's uncalled for, and it means the end of the discussion after this reply. Ofc I can imagine that; it's just usually a dumb decision.
That you did not have to upgrade the RAM means one of two things: either you had completely linear workloads, so unlike me you never switched to a compiled programming language or experimented with local LLMs etc., or you bought a lot of RAM in the beginning, so 8 years ago at a hefty premium.
That changes nothing about the fundamental disagreement with the existence of such machines, especially from a company that knows better. I do not expect ethical behaviour from a bottom-of-the-barrel company like Apple, but it was completely reasonable to expect better from Framework.