I bought a GeForce RTX 3080 at launch, and boy was I surprised at the power draw and heat/noise it pumps out. I wonder why anybody bothers with the 90-series at all.
I actually run it ~10% underclocked; it barely affects performance but greatly reduces heat/noise. These cards are configured to deliver maximum performance at any cost (short of system instability).
For my next GPU I'm probably going mid-range, to be honest; these beefy GPUs aren't worth it anymore, cost- and performance-wise. You're better off buying the cheaper models and upgrading more often.
> I bought a GeForce RTX 3080 at launch, and boy was I surprised at the power draw and heat/noise it pumps out. I wonder why anybody bothers with the 90-series at all.
More VRAM, and NVLink (on some models). You can easily run them at lower power limits. I've run CUDA workloads with my dual 3090s set as low as 170W to hit that sweet spot on the efficiency curve. You can actually go all the way down to 100W!
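For anyone who wants to try it, here's a minimal sketch of how I'd script the cap, assuming Python and the stock nvidia-smi CLI (the -pl flag takes watts and needs admin/root; indices 0 and 1 are my two cards):

    import subprocess

    # Cap each GPU's board power via nvidia-smi's -pl/--power-limit.
    # Tune the wattage per card; 170 W was the sweet spot on my 3090s.
    for gpu_index in (0, 1):  # dual 3090s
        subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", "170"],
                       check=True)

    # Verify afterwards with: nvidia-smi -q -d POWER

The limit resets on reboot, so run it at startup if you want it to stick.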
Oh well, for GPU programming, sure, VRAM is king depending on the task. But for gaming I won't go high-end again.
I'm all in on the GT 1030, the last passively coolable GPU.
But I had to upgrade to a 3050 because 2 GB of VRAM is too little for modern titles.
Fun fact: One 6600 core can saturate the 1030 for skinned mesh animations.
But it only saturates the 3050 to about 50%, which is perfect, because the world takes much less CPU: you upload it to the GPU once and then it's almost one draw call (more like one draw call per chunk, but I digress). That means OpenGL (ES) can render at lower motion-to-photon latency without waste: one core for rendering, one for gameplay, one for physics, and one for the OS, audio, and networking.
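Roughly what that world path looks like, as a sketch (assuming Python with the moderngl bindings; the shader, chunk list, and mvp uniform here are placeholders, not my actual engine code):

    import moderngl
    import numpy as np

    ctx = moderngl.create_standalone_context()
    fbo = ctx.simple_framebuffer((256, 256))  # render target for the sketch
    fbo.use()

    prog = ctx.program(
        vertex_shader="""
            #version 330
            uniform mat4 mvp;
            in vec3 in_pos;
            void main() { gl_Position = mvp * vec4(in_pos, 1.0); }
        """,
        fragment_shader="""
            #version 330
            out vec4 frag;
            void main() { frag = vec4(1.0); }
        """,
    )

    # Load time: upload each chunk's static geometry once; it stays on the GPU.
    chunks = [np.zeros((3, 3), dtype="f4")]  # placeholder vertex data per chunk
    chunk_vaos = [
        ctx.simple_vertex_array(prog, ctx.buffer(c.tobytes()), "in_pos")
        for c in chunks
    ]

    # Per frame: the CPU just sets the camera and issues one draw call
    # per chunk, which is why the world barely costs it anything.
    def draw_world(mvp: np.ndarray) -> None:
        prog["mvp"].write(mvp.astype("f4").tobytes())
        for vao in chunk_vaos:
            vao.render(moderngl.TRIANGLES)

    draw_world(np.eye(4, dtype="f4"))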
So 14nm 6600 + 8nm 3050 is actually the combination I would use forever.
HL:A runs at 90 FPS with that combo too, on the low-res Vive 1.
Not that VR is going anywhere, but it's still the most demanding application around.
When I bought the 3080 Ti, it really made me rethink my setup. I'm running a space heater in an air-conditioned room, wasting energy for nothing.
Underclock it; it helps a lot.
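If you'd rather cap clocks than power, nvidia-smi can lock the clock range too. A sketch assuming Python, a recent driver, and admin rights (the 1700 MHz cap is just an example; pick something roughly 10% under what your card actually boosts to under load):

    import subprocess

    # Lock GPU clocks to the 210-1700 MHz range via -lgc/--lock-gpu-clocks.
    subprocess.run(["nvidia-smi", "-i", "0", "-lgc", "210,1700"], check=True)

    # Revert to default boost behavior later:
    # subprocess.run(["nvidia-smi", "-i", "0", "-rgc"], check=True)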