65W TDP? Let's say we want to keep running a PC, so we switch to a newer low-end Ryzen with a 35W TDP, and assume that's a 30W difference for the whole system. Let's also say we run the system 24/7 with the CPU pulling its full TDP constantly. The average US residential electricity price is $0.18/kWh.

0.03 kW * 24 h/day * 365 days * $0.18/kWh ≈ $47.30/year

In the UK, residential electricity tariffs are currently capped by the regulator at 27.69p per kWh, which works out to £72.77 per year. Much higher than in the US, but still much cheaper than a new PC.
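The napkin math above can be sketched as a small helper (the 30 W delta and both tariffs are the figures from the thread; the constant-full-TDP assumption is the same deliberately pessimistic one):

```python
def annual_cost(delta_watts, price_per_kwh, hours_per_year=24 * 365):
    """Yearly cost of an extra `delta_watts` of constant draw."""
    kwh = delta_watts / 1000 * hours_per_year
    return kwh * price_per_kwh

us = annual_cost(30, 0.18)    # US average: $0.18/kWh -> ~$47.30/yr
uk = annual_cost(30, 0.2769)  # UK price cap: 27.69p/kWh -> ~£72.77/yr
```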

£72.77 is more than enough for a PC: https://www.ebay.co.uk/itm/377057425659

PC power draw at the wall is different from TDP. A lot of other components draw power even at idle.

Even CPU TDP is not an accurate measure. My latest AMD CPU will pull more than its rated "TDP" under certain loads.

Yup. But from the OP, all the information we have is the CPU model, and the GP decided that was enough to say it should be thrown in the trash for power inefficiency, so I thought it was enough for some bad math.

(FWIW, searching for the CPU model brings up an old review where the full system they’re testing pulls 145W under some amount of load. While that’s not nothing, it’s also not outrageous for a desktop PC that does the desktop PC things you require of it.)

So $50/yr for 4 years gives you ~$150 with $50 extra for shipping or whatever, which gets you a decent Lenovo M700 Tiny that's much better in both performance and power consumption.
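As a rough payback sketch (using the ~$47/yr savings from the earlier napkin math and a hypothetical ~$200 replacement including shipping; the new machine's own power draw is ignored here, so this flatters the upgrade):

```python
def payback_years(upfront_cost, annual_savings):
    """Years of electricity savings needed to recoup the purchase."""
    return upfront_cost / annual_savings

payback_years(200, 47.30)  # ~4.2 years, even under the 24/7 full-TDP assumption
```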

I guess. It's hardly an open-and-shut case of "throw your old computer away!" though, especially when this is a worst-case scenario of running a desktop computer at full blast 24/7 without it ever going into sleep mode or being turned off, and when you don't know what the user's needs are. Maybe a mini-PC with basically no expansion just won't really work for them?
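The worst-case assumption matters a lot here. A quick sensitivity check with the same 30 W delta at $0.18/kWh, against a hypothetical 8-hours-a-day usage pattern:

```python
# Annual savings for a 30 W delta at $0.18/kWh, under two usage patterns:
always_on = 0.030 * 24 * 365 * 0.18  # 24/7 at full load: ~$47.30/yr
workday   = 0.030 * 8 * 365 * 0.18   # 8 h/day, asleep otherwise: ~$15.77/yr
```

At the lower duty cycle the payback period roughly triples, which is the point: the $50/yr figure only holds if the machine never sleeps.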

Watts in a TDP rating are not the same as watts of electrical draw at the wall, even though both are units of power.

TDP is a thermal measurement: it's how much heat energy your heatsink and fan need to be able to dissipate to keep the unit within operating temperatures. It does not directly correspond to the amount of electricity consumed in operation.

I know, but it should be roughly correlated, and it only serves as a comparison point for wildly inaccurate napkin math anyway.

It’s close enough. Computers mostly make heat, with some math as a distant second.