Sometimes I think about the bizarre path computer technology took.

For instance, long-term storage. It would stand to reason that we'd invent some kind of big electrical array, and that would be the best we could hope for. But hard drive technology (which relies on crazy materials technology for the platter and magnets, crazy high-precision encoders, and crazy physics like flying a tiny spring-loaded head on the cushion of air created by the spinning platter) came in and blew all other technology away.

And, likewise, we've had liquid crystal technology since the '70s, and probably could have invented it sooner, but there was no need, because cathode ray tube technology had already appeared (a mini particle accelerator in your home! Plus the advanced materials science to etch the precision electron-beam holes in the shadow mask, the phosphor coating, the unusual deflection coil winding topology, and leaded glass to reduce x-ray exposure for viewers) and made all other forms of display unattractive by comparison.

It's amazing how far CRT technology got, given its disconnect from other technologies. The sophistication of the factories that created late-model "flat-screen" CRTs is truly impressive.

The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

Someday, maybe given advances in robotics and automation, I hope to start a retro CRT manufacturing company. The problems are daunting, though, such as the unavailability of the entire supply chain (you can't even buy an electron gun; it would have to be made from scratch) and environmental restrictions (leaded glass probably makes the EPA perk up and take notice).

> like people in the 80s who swore that vinyl records "sounded better"

I'm not one of those people who ever thought vinyl sounded better than a properly recorded and mastered digital version, and I've always believed a high-bandwidth digital audio signal chain can recreate the "warmth" and other artifacts of tube compressors well beyond the threshold of human perception. However, a broadcast-quality, high-definition CRT fed a pristine hi-def analog RGB signal can still create some visuals which current flat screens can't. This is only controversial because most people have never seen that kind of CRT; they were incredibly rare.

I got to see one of the broadcast production CRTs made to support NHK's analog high-definition video format in the 90s, directly connected to HD broadcast studio cameras, and the image quality was simply sensational. It was so much better than even the best consumer CRT TVs that it was another thing entirely. Of course, it cost $40,000 and only a few dozen were ever made, but it was only that expensive because these were prototypes made years before digital hi-def would be standardized and enter mass production.

In fact, I think if it were A/B compared next to a current high-end consumer flat screen, a lot of people would say the CRT looks more pleasing and overall better. For natural imagery, a CRT could render the full fidelity and sharpness of a 1080-line image but without the over-crisp 'edginess' today's high-end flat screens have. And those "cathode rays" can render uniquely rich and deep colors versus diodes and crystals. Of course, for synthetic images like computer interfaces and high-dpi text a flat screen can be better, but for natural imagery we lost something which hasn't yet been replaced. I'd love to see an ultra high-end CRT like that designed to display modern uncompressed 4K 12-bit HDR digital video.

I had a music teacher who insisted analog recordings were different.

One day she said there is a simple way to prove it. Certain stringed instruments will have their strings start vibrating on their own at the matching note if you put them near a source of a similar sound. If you put such an instrument in front of a speaker playing from an analog source, the strings move; play the exact same music from a digital source through the same speaker and the strings stop moving, even though to most humans it sounds exactly the same.

Sadly, I never had the gear to test this. I am not a professional musician and was learning from her as a hobby (she is a teacher for professional musicians).

If you ever do test this, and do it rigorously (i.e. using analogue and digital versions of the same recording, with no pitch inaccuracies), you'll find the strings resonate equally well with analogue and digital recordings, all other things (volume, tuning of the instrument, etc.) being equal.

The problem is that all other things are no longer equal, and have not been for quite some time.

Retuning digital audio to 440 Hz equal temperament is an industry norm now, even for (say) re-issued 1970s material. You just won't get modern digital versions that are the same as the analogue versions, and the equal-temperament stuff thus won't pass a resonance test unless the test instrument is also tuned to equal temperament, which most string instruments of course are not.

The far easier test for amateurs nowadays is not to buy a whole string instrument but to use pitch-monitoring applications, which all too readily show when a sound is bang on the specific equal-temperament frequencies.
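
To make that concrete, here's a minimal sketch of the check such an application effectively performs, assuming you already have a pitch estimate in Hz (the example frequencies are just illustrative): measure how far a frequency sits, in cents, from the nearest note on a 440 Hz equal-temperament grid.

```python
import math

def cents_from_equal_temperament(freq_hz: float, a4: float = 440.0) -> float:
    """Distance in cents from freq_hz to the nearest note on an
    equal-temperament grid referenced to A4 = a4 Hz."""
    semitones = 12 * math.log2(freq_hz / a4)      # fractional semitones from A4
    return (semitones - round(semitones)) * 100   # deviation from the nearest note

# Illustrative values: bang-on A4, bang-on A#4, and a pitch roughly 20 cents sharp.
for f in (440.0, 466.16, 445.0):
    print(f"{f:7.2f} Hz -> {cents_from_equal_temperament(f):+6.1f} cents")
```

Sustained notes that all land within a cent or two of that grid are a strong hint the recording has been retuned.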

Obligatory recent Fil Henley:

* https://youtube.com/watch?v=0x5dfbqE5hE

Auto-tune pitch correction is entirely separate from whether a properly engineered digital recording can match an analog recording to a level well beyond the ability of human biology to detect any difference in randomized, controlled, double-blinded ABX testing (https://en.wikipedia.org/wiki/ABX_test).
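
For what it's worth, the statistics behind scoring such an ABX session are simple. Here's a hedged sketch (the trial counts are just examples): under the null hypothesis that the listener can't tell A from B, every trial is a coin flip, so a one-sided binomial test gives the probability of the score arising by pure guessing.

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: probability of scoring at least
    `correct` out of `trials` by guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Example session: 12 correct identifications of X out of 16 trials.
print(f"p = {abx_p_value(12, 16):.3f}")  # ~0.038: unlikely to be pure guessing
```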

> The problem is that all other things are no longer equal

There are many digital recordings which have no pitch correction or other tonal manipulation applied. In those cases, all things are still equal for the purposes of the statement above.

As a separate matter, I agree auto-tune and other manipulation can be inappropriately or excessively applied. However, over-manipulation isn't unique to digital; it occurred in the analog era too, with things like dynamic range compression and multi-band dynamic equalization. Those tools existed in tube-based, purely analog form long before digital recording became the norm and drew similar complaints when they were misapplied. There were even analog pitch correctors, although they weren't nearly as flexible or precise as today's digital versions.

I find this dubious, since the effect she was describing is just resonance at the string's natural frequency. In the example provided, the source is an amplified speaker pushing air in both cases, so the outcome should be the same. The more famous demonstration of this principle is breaking a glass, and I would be surprised if that hadn't been done with digital signal inputs.

> I find this dubious

I agree. In both cases a continuously varying voltage is driving speaker cone deflection. If the voltages of two different signals vary in precisely the same way, the cone will deflect to exactly the same degree and the resulting pressure wave will generate the same resonant response from any surface it encounters. When properly implemented, today's high-end, esoteric ADC and DAC converters have insane bandwidth, frequency response and fidelity far exceeding these requirements.
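
Here's a toy numerical illustration of that point (my own sketch, with a deliberately simplified periodic window and ideal reconstruction, not a model of any particular converter): sample a band-limited tone at 48 kHz, reconstruct the continuous waveform a DAC would ideally output, and compare it against the directly computed "analog" waveform.

```python
import numpy as np

FS = 48_000    # sample rate, Hz
F0 = 480.0     # test tone, far below the 24 kHz Nyquist limit
N = 4_800      # 0.1 s of samples -> exactly 48 cycles (periodic window)
UP = 8         # oversampling factor for the fine "analog" comparison grid

# "Analog" reference waveform on a fine time grid.
t_fine = np.arange(N * UP) / (FS * UP)
analog = np.sin(2 * np.pi * F0 * t_fine)

# "Digital" path: sample at FS, then reconstruct the band-limited waveform by
# zero-padding the spectrum (ideal interpolation for a periodic window).
samples = np.sin(2 * np.pi * F0 * np.arange(N) / FS)
spectrum = np.fft.rfft(samples)
padded = np.zeros(N * UP // 2 + 1, dtype=complex)
padded[: len(spectrum)] = spectrum
recon = np.fft.irfft(padded, n=N * UP) * UP

# Effectively zero (floating-point rounding noise): the cone is driven by the
# same voltage waveform either way, so the resonant response is the same.
print(f"max |analog - reconstructed| = {np.max(np.abs(analog - recon)):.1e}")
```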

Some of the confusion comes from the fact that back when consumer audio transitioned to digital and these production workflows were new, some early digital recordings were incorrectly engineered or mastered, creating artifacts such as aliasing which critical listeners could hear. Some people assumed the artifacts they heard were innate to all digital audio instead of just an incorrect implementation of a new technology. Even today it's possible to screw up the fidelity of a digital master, but it's rarely an issue because workflows are standardized and modern tooling has default presets based on well-validated audio science (for example: https://en.wikipedia.org/wiki/Noise_shaping#Dithering). But even in the analog era it was always a truism in audio and video engineering that "there are infinite ways to screw up a signal but only a few ways to preserve it." That remains true today. To me, one of the best things about modern digital tooling is that it's much easier to verify correctness in the signal chain.
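
As a hedged illustration of the dithering point (the bit depth and the crude peak-to-mean measurement are mine, purely for demonstration): truncating a low-level tone to a coarse word length without dither produces error that is harmonically correlated with the signal, while adding TPDF dither before rounding turns that error into spectrally flat noise.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f0, n = 48_000, 1_000.0, 48_000
x = 0.25 * np.sin(2 * np.pi * f0 * np.arange(n) / fs)   # low-level 1 kHz tone

def quantize(signal, bits, dither=False):
    """Quantize `signal` (full scale +/-1.0) to `bits` bits, optionally
    adding +/-1 LSB triangular (TPDF) dither before rounding."""
    step = 2.0 ** (1 - bits)                             # one LSB
    d = step * (rng.random(signal.size) - rng.random(signal.size)) if dither else 0.0
    return np.round((signal + d) / step) * step

for dither in (False, True):
    err = quantize(x, bits=8, dither=dither) - x
    spec = np.abs(np.fft.rfft(err * np.hanning(n)))
    # Without dither the error spectrum has sharp harmonic peaks; with dither
    # it is close to flat, so the peak-to-mean ratio collapses.
    label = "with dither   " if dither else "without dither"
    print(f"{label}: error spectrum peak/mean = {spec.max() / spec.mean():.0f}")
```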

Have you looked at any high end OLEDs lately?

Yes, my background is in broadcast video engineering. An edit suite I'm in regularly has a $10,000 24-inch BVM-E251 reference monitor for color grading. At home I have a $4,000 LG C5 OLED. My dedicated home theater room is based around a $12,000 4K laser projector. I also own a Sony BVM-series broadcast CRT, various Trinitron CRTs and a retro gaming arcade cabinet built around a 25-inch analog RGB industrial CRT (Wells Gardner D9200). I use an optical colorimeter for display calibration.

All of these displays are unique tools, each with differing capabilities. I own and use them all for what they are best at. Flat panel technologies can produce incredible visuals with certain strengths no CRT can replicate (when properly calibrated and given a high-quality source signal). However, the reverse is also true: extremely high-end cathode ray technology, with an appropriately high-definition dot/shadow mask and phosphors, can generate visuals with traits no current flat panel display technology can duplicate. To be clear, I'm not talking about any CRT consumer television you've ever seen. A decent OLED display of today can look far better than even the best 1990s televisions, but consumer televisions were standard definition and hard-limited to less than 6 MHz of signal bandwidth (usually much less), so any comparison between the fundamental display technologies ('cathode ray-irradiated phosphors' vs 'light emitting diodes') is meaningless if not evaluated with the same resolution and bandwidth of input signal. And you've never seen a high-definition CRT like a KW-3600HD fed with a 30 MHz HD source signal. But they exist and I've seen one.
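
As a rough back-of-the-envelope check on those bandwidth figures (my arithmetic, ignoring blanking intervals and channel filtering, so real-world numbers differ somewhat): the highest luma frequency is about one cycle per two horizontal pixels, swept across every line of every frame.

```python
def approx_luma_bandwidth_mhz(h_pixels, total_lines, frames_per_sec):
    """Crude analog luma bandwidth estimate: one cycle per two horizontal
    pixels, times total lines per frame, times frames per second."""
    return (h_pixels / 2) * total_lines * frames_per_sec / 1e6

# ~525-line standard-definition raster vs. ~1125-line analog HD raster
print(f"SD: {approx_luma_bandwidth_mhz(720, 525, 30):.1f} MHz")    # ~5.7 MHz
print(f"HD: {approx_luma_bandwidth_mhz(1920, 1125, 30):.1f} MHz")  # ~32.4 MHz
```

That roughly matches the sub-6 MHz ceiling of consumer sets and the ~30 MHz an analog HD signal needs.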

Everything in display engineering involves trade-offs. CRTs and light emitting diodes are based on different materials with fundamentally different optical properties and underlying physics. Each has its own unique strengths. Neither can fully replicate the entire range of the other in every respect. This is not a personal aesthetic opinion; it's a carefully qualified technical assessment based on objective measurement, and it's consistent with the physical capabilities of the respective technologies.

Imo OLED has completely eclipsed CRT by now.

I don't know enough to say where CRTs could be today if they had gotten the development $ that went into other tech. But to be as good as OLEDs they would have had to find something other than phosphor as the inner coating.

CRT will always remain the king of dark-to-light response times, but phosphor afterglow would always make bright-to-dark transitions a factor unless a different coating were developed. OLEDs have no such issues. Subjectively, the claimed < 0.1 ms response times are real and there are zero artifacts: no afterglow, no ghosts, just extremely sharp and well-defined motion.

Ghosting from prolonged display of a static image is real, though.

> It's amazing how far CRT technology got

And China is still building, today, brand new CRT boards for CRT TVs and monitors. You can buy them on AliExpress.

I don't know if CRTs themselves are still being built, though.

I'm hanging on to my vintage arcade cab from the 80s with its still-working huge CRT screen. Hope I fail before that thing (and I hope it doesn't fail anytime soon!).

> The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

CRTs don't have particularly good refresh rates. There is very little delay on the output scan, but 99% of the time the delays built into rendering make that irrelevant compared to fast screens using other technologies. And the time between scans doesn't go very low.

I have no idea what you mean by internal glow.

The heated filament in many old CRTs would glow orange.

Okay, then that was already pretty much gone by the time we switched to LCD.