For games it's generally not worthwhile, since performance is almost entirely GPU-bound these days.

Almost all build guides will say ‘get midrange CPU X over high-end chip Y and put the savings toward a better GPU’.

Consoles in particular are just a decent GPU paired with a fairly low-end CPU these days. The Xbox One, with a 1.75 GHz 8-core AMD chip from a couple of generations ago, is still playing all the latest games.

Depending on the game there can be a large difference. Ryzen CPUs with larger caches (the X3D parts) show a large benefit in single-player games with many units, like Civilization, and in most multiplayer games. It's not so much clock speed as being able to keep most of the hot-path code and data in cache.
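The cache point can be sketched in code. A toy Python illustration (hypothetical names, nothing from any actual game engine): iterating over hot fields stored contiguously (struct-of-arrays) streams through memory sequentially, while the same data scattered across heap objects (array-of-structs) hops between cache lines.

```python
from array import array

# Array-of-structs: each unit is a separate heap object, so the hot
# field (hp) is scattered across memory; a pass over it touches many
# unrelated cache lines.
class Unit:
    def __init__(self, x, y, hp):
        self.x, self.y, self.hp = x, y, hp

def total_hp_aos(units):
    return sum(u.hp for u in units)

# Struct-of-arrays: each hot field lives in one contiguous buffer, so
# a pass over hp streams through memory in order.
class Units:
    def __init__(self, n):
        self.x = array("d", [0.0] * n)
        self.y = array("d", [0.0] * n)
        self.hp = array("d", [100.0] * n)

def total_hp_soa(units):
    return sum(units.hp)
```

Same answer either way; the difference only shows up in how well the working set fits in (and streams through) cache, which is exactly where the extra X3D cache pays off.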

Anecdote: I got a massive performance (FPS) improvement in games after upgrading my CPU recently, with no GPU change.

I think that build guide doesn't currently apply, given what's going on with GPUs. It was valid in the past, and will be valid again in the future, I hope!

How old was your previous CPU? Different people have vastly different expectations when it comes to upgrading. I'm certain I can play all of the games that I'm interested in on my 3-year-old Ryzen 7600X, and that I'm limited by the 5-year-old GPU (which I dread upgrading because of the crunch). Would someone with a 5-year-old CPU be well served by upgrading to a 9600X? Absolutely. But some people think they have to upgrade their Threadripper every year.

(As far as work goes, I realize this directly contradicts the OP's point, which is the intent. If you know your workflow involves lots of compiling and local compute, absolutely buy a recent Threadripper. I find that most of the time the money spent on extra cores would be better spent on a more modest CPU with more RAM and a faster SSD. And more thoughtful developer tooling that doesn't force me to recompile the entire Rust work tree and its dependencies with every git pull.)

I think it was a Ryzen 3600X, state of the art 6-7 years ago, replaced with a 9950X. I was surprised by how big of a difference the CPU update had on frame rates. (GPU: 4080)

I also do a lot of Rust compiling (which you hinted at), and molecular dynamics sims leveraging a mix of CUDA/GPU, thread pools, and SIMD.

Makes sense, yeah, a 3600X is far behind the curve now.

Edit: Took a look at AMD's lineup and realized they did something I got conditioned not to expect: they've maintained AM5 socket compatibility for 3 generations in a row. This makes me far more likely to upgrade the CPU!

https://www.amd.com/en/products/processors/chipsets/am5.html

> all AMD Socket AM5 motherboards are compatible with all AMD Socket AM5 processors

I love this. Intel was known to change the socket every year or two, basically out of spite or some awful marketing strategy. So many wasted motherboards.

Oh wow. Didn't save me though. I've never been able to drop a new CPU into a motherboard - it's always CPU + RAM + MB time due to the socket consideration you mention.


Even for compilation workloads, you need to benchmark beforehand. Threadrippers have lower boost clocks and (in the higher core count models) lower base frequencies than the high end Ryzen desktop CPUs. Most build systems are not optimized for such high core counts.

>> For games it's generally not worthwhile, since performance is almost entirely GPU-bound these days.

It completely depends on the game. The Civilization series, for example, is mostly CPU-bound, which is why turns take longer and longer as a game progresses.

Factorio and Stellaris are others I'm aware of.

In Factorio it's only an issue when you go way past the endgame into 1000+ hour megabases.

Stellaris is just poorly coded, with lots of O(n^2) algorithms, and can run slowly on anything once population and fleets grow a bit.
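To make the O(n^2) complaint concrete, here's a toy Python sketch (purely hypothetical, not Stellaris code): finding all pairs of nearby ships by comparing everyone against everyone, versus bucketing them into a spatial grid so each ship is only compared against its neighbours.

```python
import math
from collections import defaultdict

def close_pairs_naive(ships, radius):
    """O(n^2): compare every ship against every other ship."""
    pairs = []
    for i in range(len(ships)):
        for j in range(i + 1, len(ships)):
            if math.dist(ships[i], ships[j]) <= radius:
                pairs.append((i, j))
    return pairs

def close_pairs_grid(ships, radius):
    """Spatial hash with cell size = radius: only compare ships that
    fall in the same or an adjacent cell. Near-linear when ships are
    spread out instead of quadratic in the fleet size."""
    cells = defaultdict(list)
    for i, (x, y) in enumerate(ships):
        cells[(int(x // radius), int(y // radius))].append(i)
    pairs = []
    for (cx, cy), members in cells.items():
        # candidates: this cell plus its 8 neighbours
        cand = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                cand.extend(cells.get((cx + dx, cy + dy), []))
        for a in members:
            for b in cand:
                # b > a ensures each pair is reported exactly once
                if b > a and math.dist(ships[a], ships[b]) <= radius:
                    pairs.append((a, b))
    return pairs
```

Both return the same pairs; the grid version just avoids the all-against-all scan that makes late-game fleets and populations crawl.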

For Civilization, the AI does take turns faster with a higher-end CPU, but IMHO it's also no big deal, since you spend most of your time scrolling the map and taking actions (GPU-bound work).

I think it’s reasonable to state that the exceptions here are very exceptional.

If the Civ/Stellaris devs can't handle something as basic as iteration vs. recursion, then they are damn lost.

Also, a language with a GC (not Java) would shine there; it's ideal for a turn-based game.
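The iteration-vs-recursion point, in a toy Python sketch (hypothetical flood-fill over map tiles, not code from either game): the recursive version adds a stack frame per tile and blows up on large regions, while the iterative version with an explicit stack handles regions of any size.

```python
def count_region_recursive(grid, x, y, seen):
    """Recursive flood fill: each tile adds a call-stack frame, so a
    large contiguous region can exceed the recursion limit."""
    if (x, y) in seen or grid.get((x, y)) != 1:
        return 0
    seen.add((x, y))
    return 1 + sum(count_region_recursive(grid, x + dx, y + dy, seen)
                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))

def count_region_iterative(grid, x, y):
    """Same traversal with an explicit stack: depth is bounded by
    memory, not by the language's call-stack limit."""
    seen, stack, count = set(), [(x, y)], 0
    while stack:
        cx, cy = stack.pop()
        if (cx, cy) in seen or grid.get((cx, cy)) != 1:
            continue
        seen.add((cx, cy))
        count += 1
        stack.extend(((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)))
    return count
```

On a 5000-tile strip the recursive version dies with a stack overflow at Python's default limit; the iterative one just keeps going.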