Some images to demonstrate how retro games look on CRT vs unfiltered on a modern display:

https://x.com/ruuupu1

https://old.reddit.com/r/crtgaming/comments/owdtpu/thats_why...

https://old.reddit.com/r/gaming/comments/anwgxf/here_is_an_e...

Modern emulators have post-processing filters to simulate the look, which is great. But it's not quite the same as the real thing.

Blowing things up to that size is not representative.

Back when I first started playing things on emulators, we were using 12" to 20" computer CRTs or LCDs with much higher resolution than a TV, so whether CRT or LCD, the game's pixels showed up as sharp, chunky blocks.

None of this nostalgia matches how I remember it at all.

The average CRT TV had crap color and poor brightness. Going from that, and the flicker of 1:1-size NTSC on a 20-something-inch TV, to an emulated "chunkier pixel" rendition on a progressive-scan 72+ Hz 1024x768-or-higher CRT or an LCD looked way better.

Take the side-by-side pictures and zoom WAY out on a high-res screen, or go stand several feet away from your monitor, so that they're the size they were designed and expected to be seen at, and the vast majority of the perceived improvement from making the CRT subpixels visible goes away. Then put them into motion - especially vertical motion - and those lines in between, and losing half the lines on each interlaced field, become more noticeable and distracting.

The 4th image there, of the yellow monster, is a good example. Even zooming to 50% on my high-res display makes the "bad" version suddenly look way sharper and more detailed, as the size starts to show how often "rounded dots with gaps between them" just reads as fuzziness instead of "better".

And these comparisons tend to cherry-pick, not showing examples of things that lose clarity as a result of the subpixels and scanlines rather than gain it.
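If you want to try the zoom-out test without leaving your chair, here's a minimal Pillow sketch (the filenames are hypothetical; use any of the comparison shots):

    from PIL import Image

    # Hypothetical filename - any side-by-side comparison shot will do.
    img = Image.open("crt_vs_raw_comparison.png")

    # Simulate standing back: shrink the shot toward the angular size the
    # art was meant for, then view the result at 1:1 instead of pixel-peeping.
    scale = 0.25  # e.g. a 4x-zoomed comparison viewed from 4x the distance
    small = img.resize((int(img.width * scale), int(img.height * scale)),
                       Image.LANCZOS)
    small.save("comparison_at_viewing_size.png")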

I'm the same way. The scanlined, subpixeled versions just look terrible to me.

The article concerns 'PVMs', not a phrase I remember in period, even though we had hundreds of Sony D1 monitors, which were the pinnacle of 'professional digital monitors'.

These were different beasts to civilian TVs, even top-of-the-line Trinitrons. They had none of the RF circuitry of a regular TV, and the inputs were typically component or, in the late nineties, digital - but not the digital we know today; that signal came down one BNC connector.

We had an outside broadcast company which had massive trucks full of screens for televising sports, concerts and public events. A new boss decided to outfit the new trucks with domestic TVs rather than the expected mega-expensive D1s. The trucks did not last long, much to the amusement of the crew. The TVs rattled themselves to pieces before they made it to their first event.

Unlike the civilian TVs, the Sony D1 monitors were designed to be repaired. We had people for that, and you could often see the innards of one of them if you went to see the engineers in their den. They generally did not need to be repaired, but if you have hundreds of the things then you raise the odds that a few will need a little servicing.

In the studio environment they were rack mounted with air conditioning and extremely neat cabling to some type of desk where you had the buttons to choose what camera, VT or other source went to the screen. Lighting in the gallery was also controlled, so the picture you saw was the definitive picture, with no fiddling of brightness or contrast necessary. The blacks were black, which flat screens were only really able to achieve decades later with AMOLED.

In the basement with the DigiBeta tape machines we had smaller D1s in the racks, often with an adjacent oscilloscope. You could tell if the content was 'adult material' by the oscilloscope, which I always found amusing.

The magic of TV in that era was the analog nature of the CRT. The studio set was usually very battered and yet you could put a few tens of thousands of watts of lighting onto it for the cameras to show something beautiful on the D1 monitors. The advent of HD was problematic in this regard as every dent and scratch would show, along with every wrinkle and blemish on the presenters' faces.

Video games of the era were designed around the hardware; in Europe this meant 720 x 576 PAL, with lots of that image as 'overscan'. Note that JPEG was also designed for the analog era, with slow CPUs in mind. You can change the quantization tables in JPEG to better suit digital displays and fast CPUs, but only MozJPEG does that.

You mention flickering, and most CRTs were indeed flickery; think of the electrical shops of the era and what you would catch out of the corner of your eye. Clearly you would not want this in a studio gallery, lest anyone collapse with an epileptic fit. In Europe we had 50Hz rather than 60Hz, so, even with interlacing, flicker was a thing - but only in the electrical shop, not in the studio gallery. This had more to do with genlock (for analog) than with phosphor persistence trickery.

Regarding the article, I am pleased that the D1 monitors of old have found a new fan base that truly appreciate them. In period we put a lot of work into our setups and, to this day, I struggle to come to terms with all of that expertise and expense having gone forever.

In broadcasting there has always been an 'old guard' that can remember the 'good old days'. I now feel like one of those fuddy duddies!!!

This helps validate my memories of SNES and PS1 games looking so much better when I was a kid than on an emulator today.

With 25% scanlines on PC CRTs, they looked pretty close to TVs. On LCDs, forget it. Not even close, even with CRT filters.

I played SNES and PS1 games on a CRT. I played them on LCD and OLED TVs. I can’t tell the difference.

I mean, I can tell that HDMI cables never introduce chromatic aberration, which was quite common on those old TVs when the SCART cables I used got old. And I've never had an LCD screen catch fire, something that happened to me twice with aging CRT screens.

I really don't get the nostalgia, or whatever it should be called when some of the people who think it was better back then weren't even born at the time.

There are some effects that notoriously work only on rather specific combinations of screens AND cables. Those look horrible on emulators.

Usually it is effects involving transparency. Some games literally rendered things only on alternating frames to achieve 50% transparency; others alternated scanlines. The craziest one is Sonic, which made a transparent waterfall by relying on the fact that the cables common at the time blurred pixels horizontally: it renders one column of water and one column of background, and hopes they get blurred into a single column of 50%-transparent water on top of the background.

Screenshot of the waterfall effect: https://i.ytimg.com/vi/1cGAP1i_4xQ/maxresdefault.jpg
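A minimal numpy sketch of why the trick works, with a two-tap average standing in for the composite cable's horizontal low-pass (the real signal path is analog and messier):

    import numpy as np

    # One scanline: alternating columns of waterfall and background, as the
    # console actually draws it (and as a sharp emulator shows it).
    water, background = 1.0, 0.0
    scanline = np.array([water, background] * 8)

    # Crude stand-in for the horizontal low-pass of a composite/RF cable:
    # each output pixel averages two adjacent input pixels.
    blurred = np.convolve(scanline, [0.5, 0.5], mode="same")

    print(scanline)  # 1, 0, 1, 0, ... - hard vertical stripes
    print(blurred)   # 0.5 everywhere - a flat 50% blend of water over background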

Can't be shown with a screenshot: Axelay. I've never seen that game running on a real CRT to compare, but on emulators it looks horrible, with distortions and flickering everywhere. I was told this was not the intention at all; instead, the developers relied heavily on CRT hardware to create pseudo-3D and transparency.

I think if you're looking from a normal couch to TV distance, you probably can't tell the difference.

If you're sitting computer-desk-to-monitor distance (close enough to read text), you probably can tell a difference. From there, I think there is some false nostalgia at worst, or subjectivity at best. Did computer monitors even have (noticeable) scanlines? I don't remember them while playing Wolfenstein or playing around in MS Paint on Windows 3.1... Maybe playing Gorillas in DOS, but I can't remember...

If you're emulating SNES on a modern PC, people like to say "games were developed with scanlines in mind", but they were also developed assuming you'd have a small TV and would be sitting ~8 ft away from it!
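Some back-of-the-envelope arithmetic on that point (the screen sizes and distances are illustrative, not anyone's measured setup):

    import math

    def pixel_arcmin(screen_width_in, px_across, distance_in):
        """Angular size of one game pixel, in arcminutes."""
        px = screen_width_in / px_across
        return math.degrees(math.atan2(px, distance_in)) * 60

    # SNES outputs 256 pixels across. A 20" 4:3 TV is ~16" wide; 8 ft = 96".
    print(pixel_arcmin(16, 256, 96))  # ~2.2 arcmin per game pixel
    # The same image filling a ~16"-wide window on a desk monitor at 24":
    print(pixel_arcmin(16, 256, 24))  # ~9 arcmin - each pixel ~4x larger to the eye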

Absolutely. I love playing Atari 2600 games, and it seems sacrilegious to play on anything but an old-school CRT TV.

Also, I’ve heard a CRT is required for NES light-gun games like Duck Hunt. Anyone know if this is true? I don’t have an NES, and if I did, I’d hook it up to my CRT, so I still wouldn’t know the answer :)

The NES light gun works with the properties the CRT provides. Roughly what happens is: when you pull the trigger, the next frame is all black, and then there's one frame per target with a white square for the target. If you're on target, the photodetector (photodiode? photoresistor?) will register a step change when the beam hits the white square, and the game code is looping to detect that. If the light comes late, it won't count; if it's not a big enough change, it won't count. If the screen was too bright during the black frame (or you were pointing at a light the whole time), it won't count.

Most modern displays are going to show the square too late, some might not be bright enough.

If you have an LED matrix and the right driving circuitry, you could probably replicate the timing, and that might work too, but I've not seen it done.

More details and options for LCDs https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-s...
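A toy Python model of the detection sequence described above; the timings and thresholds are illustrative guesses, not measured NES values:

    # Toy model of the Zapper check. Thresholds are made up for illustration.
    FRAME_MS = 1000 / 60.1  # NTSC NES frame time, roughly 16.6 ms

    def zapper_hit(samples):
        """samples: (time_ms, brightness 0..1) photodiode readings after the
        trigger pull. Frame 1 is all black; frame 2 draws the white square."""
        # During the black frame the sensor must read dark - pointing at a
        # lamp (or a display that never goes black) rejects the shot.
        if any(b > 0.2 for t, b in samples if t < FRAME_MS):
            return False
        # During the target frame we need a sharp step up in brightness,
        # arriving on time - not a frame late as on a buffering LCD.
        return any(b > 0.8 for t, b in samples
                   if FRAME_MS <= t < 2 * FRAME_MS)

    # CRT: dark black frame, bright flash as the beam crosses the square.
    print(zapper_hit([(5, 0.0), (12, 0.05), (20, 0.95)]))  # True
    # Laggy LCD: the white square shows up a frame too late.
    print(zapper_hit([(5, 0.0), (12, 0.05), (40, 0.95)]))  # False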

Yes - light pens and some light guns relied on the vertical/horizontal sync timing of the CRT to identify the position you pointed at (the NES Zapper instead uses the flash-frame trick described above), so they won't work on a modern screen.

> But it's not quite the same as the real thing.

To be fair, with modern "retina" HDR displays, it should be very, very close.

The most important element of the CRT look is the fast phosphor decay. This is why CRTs have so little sample-and-hold blur. No other hardware can simulate it perfectly, but a 480Hz OLED display comes close:

https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks...
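The gist of that approach, as a rough numpy sketch: split each 60 Hz frame into eight 480 Hz subframes, scan a band of rows bright in each one, and let previously lit rows fade (the decay constant here is arbitrary, not taken from the shader):

    import numpy as np

    ROWS, SUBFRAMES = 240, 8  # one 60 Hz frame shown as eight 480 Hz subframes
    DECAY = 0.3               # per-subframe phosphor persistence (arbitrary)

    def crt_subframes(frame):
        """frame: (ROWS, width) array. Returns 8 subframes in which a band of
        rows is 'scanned' bright while previously lit rows decay, roughly
        approximating a rolling CRT flash."""
        band = ROWS // SUBFRAMES
        lit = np.zeros_like(frame)
        out = []
        for s in range(SUBFRAMES):
            lit *= DECAY  # old light fades fast
            lit[s * band:(s + 1) * band] = frame[s * band:(s + 1) * band]
            out.append(lit.copy())
        return out

    subs = crt_subframes(np.ones((ROWS, 4)))
    # Each row is bright for only a sliver of the frame - that short duty
    # cycle is what kills sample-and-hold blur:
    print(sum(s.mean() for s in subs) / SUBFRAMES)  # ~0.17 instead of 1.0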

> it should be very very close

It should. It isn't. For some obscure reason, VGA colours look different on every modern LCD.

Most modern displays are calibrated to some reasonable level, and can easily accommodate the very limited gamut of an old CRT, especially anything supporting HDR10. I suspect this is more a case of "they'd need to be fudged so they're wrong" than anything.

I don't think the old CRT gamut is "very limited". Only plasma screens were as good.

Plasma has great contrast and a slightly wider gamut than a CRT. Neither one has a particularly good gamut unless you're comparing to sRGB. Many current screens can do much better.
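For a rough sense of scale (sRGB's primaries were specified around CRT phosphors, so its triangle is a fair proxy for a good consumer CRT):

    # (x, y) chromaticities of the red/green/blue primaries.
    SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # ~CRT phosphors
    P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # wide-gamut panels

    def area(tri):
        """Area of the gamut triangle in CIE 1931 xy space."""
        (x1, y1), (x2, y2), (x3, y3) = tri
        return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

    print(area(P3) / area(SRGB))  # ~1.36: over a third more xy area than sRGB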
