> all sorts of anti-aliasing, lighting, etc, and a resolution race that can only be described as pathological

I enjoy CRT nostalgia now and then, but modern high-resolution games are absolutely amazing. The blurry, low-resolution, low-refresh-rate CRT look is fun for old games, but playing at 4K and 100+ fps on a modern monitor is an amazing experience in its own right.

They're blurry in brand-new ways due to how sloppy AI upscaling is.

There's also that godawful temporal shader or whatever that leaves black outlines drifting behind moving objects. I honestly cannot fathom why anyone would willingly use that effect; it straight up looks like a rendering error.

Plus the creative ways they generate new frames using AI without even rendering them. It's hilariously sad.

GamersNexus has a great video on it. I feel lucky that my golden era of gaming used 3DFX, early ATI and NVIDIA cards.

I don't have the space, time, or enthusiasm to put together a compute-capable space heater to play games with great graphics but meh stories, microtransactions, DLCs, and thinly veiled gambling.

Why not (metaphorically) both? Consider a 16:9 Sony WEGA CRT that had an HDMI input as well as standard composite inputs.