If you don't care about power efficiency, sure ;)

I bought a GeForce RTX 3080 at launch and boy was I surprised at the power draw and the heat/noise it pumps out. I wonder why anybody bothers with the 90 series at all.

I actually run it ~10% underclocked; it barely affects performance but greatly reduces heat/noise. These cards are configured to deliver maximum performance at any cost (short of system instability).
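For reference, a minimal sketch of applying such an underclock from a script, assuming an NVIDIA driver with nvidia-smi on the PATH and admin rights; the 1710 MHz reference boost figure is an assumption about a stock 3080, so query your own card first:

```python
# Hedged sketch: lock the GPU core clock ~10% below an assumed stock boost.
# Assumes nvidia-smi is on the PATH and the script runs with admin/root rights.
import subprocess

STOCK_BOOST_MHZ = 1710                     # assumed reference boost for an RTX 3080
TARGET_MHZ = int(STOCK_BOOST_MHZ * 0.9)    # ~10% underclock

# Show the clock ranges the driver reports (purely informational).
subprocess.run(["nvidia-smi", "-q", "-d", "CLOCK"], check=True)

# Cap the core clock of GPU 0 at TARGET_MHZ.
subprocess.run(["nvidia-smi", "-i", "0", "-lgc", f"0,{TARGET_MHZ}"], check=True)

# Undo with: nvidia-smi -i 0 -rgc
```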

For my next GPU I'm probably going mid-range, to be honest; these beefy GPUs aren't worth it anymore cost- and performance-wise. You're better off buying the cheaper models and upgrading more often.

> I bought a GeForce RTX 3080 at launch and boy was I surprised at the power draw and the heat/noise it pumps out. I wonder why anybody bothers with the 90 series at all.

More VRAM, and NVLink (on some models). You can easily run them at lower power limits. I've run CUDA workloads with my dual 3090s set as low as 170W to hit that sweet spot on the efficiency curve. You can actually go all the way down to 100W!
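To illustrate the power-limit route, here's a minimal sketch using the pynvml bindings (nvidia-ml-py); the 170 W figure matches the comment above, and clamping to the card's reported constraints is my own addition:

```python
# Hedged sketch: cap a GPU's power limit via NVML (pip install nvidia-ml-py).
# Assumes an NVIDIA driver is installed and the script runs with admin rights.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

# NVML reports power in milliwatts.
lo_mw, hi_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = min(max(170_000, lo_mw), hi_mw)     # 170 W, clamped to what the card allows

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(card allows {lo_mw / 1000:.0f}-{hi_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```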

Oh well, for GPU programming, sure, since VRAM is king depending on the task. But for gaming I won't go high-end again.

I'm all in on the GT 1030, the last passively coolable GPU.

But I had to upgrade to a 3050 because 2 GB of VRAM is too little for modern titles.

Fun fact: One 6600 core can saturate the 1030 for skinned mesh animations.

But it only saturates the 3050 to about 50%, which is perfect, because the world takes much less CPU (you upload it to the GPU and then it's almost one drawcall; more like one drawcall per chunk, but I digress). That means OpenGL (ES) can render on it at lower motion-to-photon latency without waste: one core for rendering, one for gameplay, one for physics, and one for the OS, audio and networking.

So the 14 nm 6600 + 8 nm 3050 is actually the combination I would use forever.

HL:A runs at 90 FPS with that combo too, on the low-res Vive 1.

Not that VR is going anywhere, but it's still the peak demanding application.

When I bought the 3080 Ti, it really made me rethink my setup. I'm running a space heater in an air-conditioned room, wasting energy for nothing.

Underclock it, it helps a lot.

[deleted]

When it comes to "is it better to throw away the 6600 and replace it with a 5600, or to keep running the 6600?" I'm torn, but: you probably need to use the 5600 for maybe 20 years to compensate for its manufacturing energy cost (which is not directly linked to the $ cost), and I think the 6600 might last that long with new RAM 10 years down the road. I'm not so sure the newer 5600 motherboard and the CPU itself will make it that long, because they were probably made to break to a larger extent.
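To make the payback claim concrete, here's a back-of-envelope sketch; every figure in it (embodied manufacturing energy, power saved, hours of use) is an assumption for illustration, not sourced data:

```python
# Back-of-envelope payback estimate. All numbers are illustrative assumptions,
# not sourced data; plug in your own figures.
EMBODIED_KWH = 350       # assumed manufacturing energy for a new CPU + motherboard
POWER_SAVED_W = 10       # assumed average power saving of the newer platform
HOURS_PER_DAY = 6        # assumed daily usage

saved_kwh_per_year = POWER_SAVED_W * HOURS_PER_DAY * 365 / 1000
payback_years = EMBODIED_KWH / saved_kwh_per_year
print(f"~{saved_kwh_per_year:.0f} kWh saved per year -> "
      f"~{payback_years:.0f} years to amortize the manufacturing energy")
```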

Also, the 6600 can be passively cooled in the Streacom case I already have; the 5600 is too hot.

This is a little confusing because you were referring to a Core i5-6600 and presumably an i5-7500T or i7-7700T above but now you mention a 5600. Are you referring to a Ryzen 5 5600?

Yes

As we discussed elsewhere, you could put the Ryzen 5 5600 into 45W ECO mode and cool it passively. Or a Ryzen 5 7600 if you decide to jump up to AM5 (which is probably a good idea even though you'll need new RAM).

As I probably replied there too, I don't want to build hardware that can burn my house down if the parts are crap and the settings decide to reset for some reason.

It will simply throttle.

Which is not good for longevity.