> Nvidia abandons the gaming market

Citation? I've been hearing this from Gamer's Nexus for decades, but Nvidia seems to be fine, RAM shortage notwithstanding.

I was thinking about the revenue shift as described in articles like this one: https://www.tomshardware.com/tech-industry/big-tech/nvidia-g...

Arguably, Nvidia has a point, probably more than the other companies, because they really are at the heart of the current buildout gold rush. So it's more actual economics for them than the FOMO it feels like for the other companies.

The last three generations of Nvidia GPUs have been a big middle finger to PC gamers in terms of price and power usage.

Gone are the days of affordable graphics accelerators in the $300 to $500 range. Now it’s $1000 to $2000. 400 watts now instead of 100.

"Pay more and get less" has been the trend.

https://www.xda-developers.com/shrinkflation-is-making-nvidi...

That's even before you get into bullshit like fake frames

> "Pay more and get less" has been the trend

That article doesn't support what you're saying whatsoever. GPU cores going down at the same price point is the opposite of shrinkflation, especially when you consider the US dollar is worth ~40% less than it was in 2012. And VRAM prices aren't going down anywhere, especially now.

> bullshit like fake frames

Fake frames are an option, not a requirement. You can still play at native 4K/8K resolution, at a corresponding 2.25-4x cost in power usage and raster compute. It will be miserable, but that's your choice.
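For what it's worth, the 2.25-4x figure falls straight out of pixel counts, assuming raster cost scales roughly linearly with resolution. (Which internal resolution corresponds to which upscaler quality mode is my assumption here, not something stated above.)

```python
# Back-of-envelope: raster cost scales roughly with pixel count.
# Compare native 4K against the internal resolutions upscalers
# commonly render at before reconstructing the full-res image.
native_4k = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440         # 3,686,400 pixels ("quality"-style internal res, assumed)
fhd = 1920 * 1080         # 2,073,600 pixels ("performance"-style internal res, assumed)

print(native_4k / qhd)  # 2.25 -- native 4K draws 2.25x the pixels of 1440p
print(native_4k / fhd)  # 4.0  -- and 4x the pixels of 1080p
```

So "native everything" really does mean paying that 2.25-4x in raw raster work, before any other settings enter the picture.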

> Gone are the days of affordable graphics accelerators in the $300 to $500 range. Now it’s $1000 to $2000.

What are you talking about? Nvidia only has two models in the $1000 to $2000 range, and they're clearly premium parts.

The $300 to $500 cards are actually fine for normal gaming unless you demand to play at 4K at high settings.

> The $300 to $500 cards are actually fine for normal gaming unless you demand to play at 4K at high settings.

I don't think that wanting to play games at the native resolution of your screen, without changing settings from their defaults to make the game look and perform much worse, is a very unreasonable "demand".

That used to be possible without spending as much money, and it's also not unreasonable for people to point that out.

Fair, but there are 2025 games that don't even run well on the 5090. This is the fault of game developers who think they're making the next Crysis, targeting some hypothetical future hardware instead of providing a great experience on today's midrange hardware.

Looking at the best looking games from today vs 10 years ago, they're so similar it's hard to see where that extra performance is even going.

So far waiting ~5 years to bother with them has been a working strategy for me.

> That used to be possible without spending as much money and it's also not unreasonable for people to point that out

That used to be possible when the most common resolution was 1080p and refresh rates weren't pushing 240 Hz+.

Pretty much all the lower price cards are a bad buy. Nvidia is only competitive on performance at the absolute top end, where they have no competitors. In every other price bracket they lose to AMD and Intel.

You're right.

People want to pretend fundamentals of economics don't exist AND the company has moral obligations to fulfill to consumers. It's laughable.

It's not just Nvidia; I've seen other expensive consumer brands drawing the same sentiment.

> I've been hearing this from Gamer's Nexus for decades

I liked the idea of Gamer's Nexus at first, when it was supposed to be data-first, rigorous, independent journalism.

Somewhere along the way it turned into a constant grievance and outrage channel. I guess audience capture pays the bills and YouTube drama is hard to ignore. I haven't bothered with that channel since they tried to go to war with Linus Tech Tips. I don't even watch LTT, and I certainly don't want to watch two channels go to YouTube war against each other when I'm just trying to hear how the latest coolers perform or something.

I think a lot of the ultra cynical HN comments about how it’s the end of computing or how gamers have been abandoned are coming from these channels, though.

It's about which market segment gets priority within the company. That doesn't mean they'll stop making gaming cards altogether.

Sure, that makes sense. I don't think anyone ever treated Nvidia like the "pure raster" competitor, though. Sacrifices have been made for CUDA for 10+ years, and when the Nintendo Switch shipped, it was with automotive-grade SoCs. Gamers have been chopped liver for decades, but they still get GPU releases and software products.

Looking at the flip side, Apple, AMD, and Intel all eschewed compute performance in favor of raster and have nothing to show for it. No "DLSS killer" in sight, no CUDA alternative, nothing. It seems like gaming revenue is a ball and chain holding back profitable applications.