They were ok for the price... I think they were probably the most responsible for squeezing every bit of profitability from independent builders though. It really became a race to the bottom, combined with more interest in mobile/laptop computers.

I remember in the mid to late 90's, you could build a computer for someone and walk away with enough for an upgraded system for yourself. Of course the churn on performance was very real. IIRC, 1992 maxed out with a 486 DX2 @ 66MHz. Around 2000 we crossed the 1GHz mark from both Intel and AMD. We went from OG Doom that couldn't cut it full screen, to Half-Life and Quake 3 Arena on Voodoo 3 and early NVIDIA cards.

Around 2000 I remember building myself new PCs every time someone wanted a gaming PC; I'd sell them my existing system and build myself the new hotness. I'd always buy just where the price/performance curve was starting to go vertical, so the 2-3 month old machine was still quite a good "deal".

That stopped being effective sometime before 2010. Instead I'd recommend buying a decent enough machine and sticking a graphics card in it.

Yeah... late 90's to early 00's was pretty much the peak builder era. I learned through a relative fluke how much going a bit over the top on memory and faster drives really helped for general use, often more than a faster CPU/GPU. My current computer is literally the first I've built in decades where I didn't max out the RAM... I mean, I kind of did, as 2x48GB was the most I could get in DDR5 @ 6000 while only using 2 slots (populating all 4 would run much slower).

4th gen Core series was the longest I'd held onto a single PC (close to 5 years total for a 4790K). I did a mid-cycle GPU and NVMe upgrade and that was it. I bumped to a 3950X, then a 5950X, and now a 9950X... AM4 is really the first socket in a long time where I'd done an in-place CPU upgrade at all: my daughter's Ryzen 2400 to a 5000 series, and my own build from a 3600 -> 3950X -> 5950X... the 3600 was a placeholder as I couldn't get a 3950X for a few months.

I couldn't even name half the CPUs I ran from 1998 to 2005 or so... it was such a blur of upgrades every 6-12 months... I'd upgrade my computer, my wife's, my son's... etc. Then, things just completely stagnated... I mean there's been progress, but it's over the course of years, not seeing 2-3x in under a year.

I remember that it started to stall out on RAM before CPU (in that it became "reasonable" to have way more RAM than you really needed - Chrome didn't exist yet lol), and that the very early move to multi-core was a bit of a downer (much software couldn't use more than 2 or 3 cores, so a "new CPU" with 4 cores instead of 2 but the same single-thread performance was hardly noticeable).

Then of course there was the huge "replace everything with SSDs ASAP" performance bump, but from the later Core generations until the M1, everything felt incremental. Nothing like the "Wolfenstein 3D to Quake with Glide in 5 years" era.

Holy shit it was only 5 years - the M1 was released 6 years ago!

From Doom (~1993) to the Unreal engine (~1998), used even for Deus Ex, in just 5 years. For Gen Zers: a bigger jump than the one from the 3DS to the Switch.

By comparison, from the first Core Duo computers to today... we should have been roaming around at street level in fully 3D-rendered cities in Street View long ago, at least with Half-Life 2 quality graphics, with some high-res textures and lighting bolted on to fake the most complex effects.

And yet what we have is the exact same rendering that PCs and PowerPC Macs did with virtual-tour multimedia CDs in the late 90's (and Encarta), just at a higher resolution. Literally: a 2D image rendered inside a sphere.

That's it, something you can do in software today without much effort. Take some panoramic image and a dedicated viewer for it such as Panini, with no advanced 3D support at all, and you are literally doing the same thing the JS and WebGL behind Street View do with their still images, except on a Core Duo CPU or a high-end Pentium 4.
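For what it's worth, the "2D image inside a sphere" trick really is just a coordinate mapping: a panorama viewer takes the direction you're looking in and samples an equirectangular image at the matching longitude/latitude. A minimal sketch of that core math (the function name and exact conventions here are illustrative, not taken from Panini or Street View):

```python
import math

def equirect_uv(yaw: float, pitch: float) -> tuple[float, float]:
    """Map a view direction (in radians) to normalized (u, v) texture
    coordinates in an equirectangular panorama image.

    yaw:   rotation around the vertical axis, -pi..pi (0 = image center)
    pitch: elevation, -pi/2 (straight down) .. pi/2 (straight up)
    """
    u = (yaw + math.pi) / (2 * math.pi)   # longitude -> horizontal position 0..1
    v = (math.pi / 2 - pitch) / math.pi   # latitude  -> vertical position 0..1
    return u, v

# Looking straight ahead lands in the middle of the image:
print(equirect_uv(0.0, 0.0))  # -> (0.5, 0.5)
```

A viewer just evaluates this per screen pixel (after rotating the view ray by the camera orientation) and texture-samples the photo, which is why a late-90s CPU could already do it: there's no geometry or lighting involved at all.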

Which is a technology closer in concept to Myst than to the Unreal Engine. There's no real progress like what we lived through in the 90's. Back then, years passed as huge steps, in the same way a 10-year-old kid has almost nothing in common, taste-wise, with a 14-year-old. Technology in our lives moved the same way: from Walkmans to CDs, and from crappy MS-DOS computers in mid elementary school to Windows 98 by nearly high school.

Part of it is that things are slowing down, so the same "solution" keeps getting selected: the Unreal engine was released in 1998, and people are still shipping on an evolved Unreal Engine today.

I feel we haven't really had a different paradigm for decades, and I'm not sure AI even really is one.