One can see it that way, granted. When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc. All our beloved consumer tech started out as absurdly high-priced niche stuff for them. We've been sold the overflow capacity and binned parts. And that seems to be a more-or-less natural consequence of large purchasers signing large checks and entering predictable contracts. Individual consumers are very price-sensitive and fickle by comparison. From that perspective, anything that increases overall capacity should also increase the supply of binned parts and overflow, which will eventually benefit consumers, though the intervening market adjustment period may be painful (as we are seeing). Consumers have also benefited greatly from shrinking component sizes, which effectively increase production capacity at a fixed wafer volume.

> When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc.

Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.

What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.

The iPhone wasn't designed or marketed to large corporations. 3dfx didn't invent the Voodoo for B2B sales. IBM didn't branch out from international business machines to the personal computer for business sales. The compact disc wasn't invented for corporate storage.

Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.

There's a lot more of us than them.

There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does, high-volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.

> The iPhone wasn't designed or marketed to large corporations.

The iPhone isn't exactly a consumer computation device. From that perspective, it does less work at a higher cost.

Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in these processes, and 'back in the day' it wasn't uncommon for games, particularly his, to be developed in collaboration with the hardware guys to run on tech that did not yet exist. Your desktop was outdated after a year and obsolete after two, so it was a very different time from today, where your example is not only completely accurate but really an understatement - a good computer from 10 years ago can still do 99.9% of what people need, and even things like high-end gaming are perfectly viable on quite dated cards.

yes. a good reason to upgrade was PCIe 4.0 for I/O. GPU and SSD needs caused PCIe 5.0 to follow soon after.
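For a sense of scale, here's a rough back-of-envelope in Python, using the published per-lane transfer rates and 128b/130b encoding (real deliverable bandwidth is a bit lower once protocol overhead is counted):

    # Approximate per-lane PCIe throughput by generation: raw transfer rate
    # times 128b/130b encoding efficiency, divided by 8 bits per byte.
    # Shown for x4 (typical NVMe SSD) and x16 (typical GPU) links.
    GENS = {"3.0": 8.0, "4.0": 16.0, "5.0": 32.0}  # GT/s per lane
    ENCODING = 128 / 130

    for gen, gt_per_s in GENS.items():
        lane_gb_s = gt_per_s * ENCODING / 8
        print(f"PCIe {gen}: x4 ~{lane_gb_s * 4:.1f} GB/s, x16 ~{lane_gb_s * 16:.1f} GB/s")

The doubling per generation is the whole story: an x4 NVMe drive on 3.0 tops out around 3.9 GB/s, so faster SSDs and GPUs keep pulling the platform forward.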

> We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.

Arguably we don't. Most of the improvements these days seem to be on the GPGPU side, with very little gain in raster performance this decade.

Gaming drove the development of GPUs which led to the current AI boom. Smartphones drove small process nodes for power efficiency.

SGI and 3Dfx made high-end simulators for aerospace in the beginning. Gaming grew out of that. Even Intel's first GPU (the i740) came from GE Aerospace.

Wolfenstein 3D was released before 3dfx existed, was purely CPU-rendered, and is generally considered the father of modern 3D shooters. Even without the scientific computing angle, GPUs would have been developed for gaming simply because it was a good idea that clearly had a big market.

Flight simulators just had more cash for more advanced chips, but arcade hardware like the Sega Model 1 (Virtua Racing) was, via Virtua Fighter, an inspiration for the PlayStation, and before that there were crude 3D games on both PC and Amiga.

Games were always going to go 3D sooner or later. The real pressure of the high-volume, competitive market got us more and more capable chips, until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have managed.

> Flight simulators just had more cash for more advanced chips

Yes. That is my point. The customers willing to pay the high initial R&D costs opened up the potential for wider adoption. This is always the case.

Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.

No question that chip companies are happy to take consumers' money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.

It's symbiotic, I suppose.

3dfx didn't. They had a subsidiary (spinoff?), Quantum3D, that reused 3dfx commodity chips to build cards for simulators.

100%. We’ve seen crazy swings in RAM prices before.

A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a time machine were available, we could have brought like 4 loaded MacBook Pros back and replaced a $1M HP 3PAR SSD array :)
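For fun, here's a minimal sketch of that kind of comparison. Every number below is a made-up ballpark assumption (laptop SSD size and throughput, the array's usable capacity and throughput), not the colleague's actual figures:

    # Purely illustrative, with invented ballpark numbers: a handful of
    # top-spec laptops vs. a circa-2015 all-flash array, on raw capacity
    # and sequential throughput only (ignores redundancy, sharing, HA, etc.).
    laptops = 4
    laptop_ssd_tb = 8        # assumed: fully loaded modern MacBook Pro SSD
    laptop_seq_gb_s = 6      # assumed: per-laptop sequential read throughput

    array_usable_tb = 30     # assumed: usable flash in a ~$1M array of that era
    array_seq_gb_s = 5       # assumed: aggregate array throughput

    print(f"Laptops: {laptops * laptop_ssd_tb} TB, ~{laptops * laptop_seq_gb_s} GB/s")
    print(f"Array:   {array_usable_tb} TB, ~{array_seq_gb_s} GB/s")

The point of the joke stands either way: raw consumer flash caught up with (and blew past) what enterprise arrays of that era sold for seven figures, even if the array was buying you redundancy and shared access rather than raw speed.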