So. Nvidia abandons the gaming market, Microsoft abandons Windows and Google abandons Search, all because AI is supposedly more important.
At some point it starts to feel like a drug for C-suites.
The drug is the dream of replacing their serfs (you) with robots. They feel like what passes for AI is sooo close, and they're willing to destroy anything and everything to make their dream a reality, but I think they're going to be disappointed. Until they face reality, we're all going to increasingly suffer.
Not even really "Until they face reality". I think "we're all going to increasingly suffer" suffices. Not to be too provocative, but I think that even if LLMs don't quite pan out for what they dream of using them for, there's still about a billion other things that can be tightened up to make your life more miserable and their lives more profitable. More ads, more abusive trickery, more 'attestation'/'safety'/'integrity', more subscription services, more discontinuations of old software and features that worked perfectly fine, more total oversight and tracking. It's the trajectory we've been on for a very long time, and there's no reason for it to change.
The C-suites are there to please the shareholders more than the customers. The shareholders hear how much AI companies are supposedly making and believe money is being left on the table unless their co-owned companies invest in AI too. The C-suites then go after AI, because none of them wants to be told they haven't done anything to profit off it.
Hum. Windows has been riddled with bugs forever. I don’t see how this is connected to Microsoft abandoning Windows in favor of AI.
Windows' code base is just too heavy to maintain. They need to break compatibility with older products the way macOS often does, so that Windows becomes manageable again… but that seems to go against Microsoft's philosophy.
I only used Windows at work and for games in a VM, so take that with a grain of salt:
Older Windows bugs seemed fair: mostly edge cases, weird UI interactions, or stuff that only came out under heavy workloads (also, the Windows file system).
These past few years, the bugs have been incomprehensible. I understand non-professional versions have been treated as beta since Win10, but it feels like the Home versions are actually alpha, and Windows Pro seems more and more like a beta.
NT4 had many serious BSODs. SP6 was so problematic due to a critical bug in LSA that it was re-released as SP6a.
Windows bugs have moved more and more into the 'edge case' territory. Not that major issues don't crop up for "everyone" today, but BSODs used to be much more common. Part of that was due to the architecture, thus drivers, but the other side of it was core Windows functionality that just had bugs.
The kernel is almost perfect these days. Can't say the same about the user environment. Explorer and the shell are buggier than ever.
Explorer is the shell ;)
But Explorer has had its fair share of issues. I have a 98SE machine to prove the stalls, lockups, failure to refresh directories, etc...
Not anymore, not entirely. The Start menu and taskbar live in a separate process now, I believe. Welcome to the 2020s!
Explorer.exe is still the shell -- the shell is defined at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon\Shell, if you want to look (or replace it).
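For the curious, a minimal Python sketch that reads that value (standard-library winreg, Windows only; just a sketch, not anyone's production code):

    import winreg  # standard library; Windows only

    # Read the shell value from the Winlogon key mentioned above.
    path = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        shell, _type = winreg.QueryValueEx(key, "Shell")
    print(shell)  # prints "explorer.exe" on a stock install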
That's not the whole story. The challenges associated with legacy app support have nothing to do with ads, telemetry, Recall, and jamming AI into every crevice. Microsoft is doing both things wrong, and 11 is a hot mess for it.
> Hum. Windows has been riddled with bugs forever
Windows had a reasonable share of bugs, commensurate with its huge breadth and backwards-compatibility needs. Otherwise, it was very stable and mature.
Now it's gotten way worse...
Yes, but it happened before ChatGPT. Windows 11 was released in 2021, already with a shitty taskbar, search, and Start menu.
Yes, but those were bad by choice.
Whereas the parent was alluding to bugs just piling up gratuitously.
I didn't mean poor design, I meant literal bugs. Start menu search still fails to find things on my desktop, and has done so since Win 11. Thank God for Everything.
And Start11 (which can integrate with Everything!)
I mean, define "now". Windows has been getting progressively buggier for a long time, before Windows 11 or even 10. Windows 11 is pretty bad, but it's been bad since day 1: the taskbar crashes for me a few times a day, File Explorer crashes, and random things have stupid amounts of lag.
It more or less worked fine until they fired all the QA staff and bundled Windows with adware.
The YouTube ads I get are all AI fearmongering: "You have to do this or you'll be out of a job!" and so on.
It's madness.
To be fair to Nvidia, they're only selling the pickaxes in this gold rush. It's smart for them to focus on the market that pays the most. When the AI bubble pops, they'll still be in a great position and they'll be able to return to gaming.
Are they only selling pickaxes? I thought they were also investing in prospecting firms, who then turn around and buy more pickaxes. Seems kind of dodgy, but also a lot riskier than just selling pickaxes.
They are training AI models.
They are investing in AI companies (OpenAI).
> Nvidia abandons the gaming market
Citation? I've been hearing this from Gamer's Nexus for decades, but Nvidia seems to be fine, RAM shortage notwithstanding.
I was thinking about the revenue shift as described in articles like this one: https://www.tomshardware.com/tech-industry/big-tech/nvidia-g...
Arguably, Nvidia has a point, probably more than the other companies, because they really are at the heart of the current buildout gold rush. So it's more actual economics for them than the FOMO it feels like for the other companies.
The last 3 generations of Nvidia GPUs have been a big middle finger to PC gamers in terms of price and power usage.
Gone are the days of affordable graphics accelerators in the $300 to $500 range. Now it’s $1000 to $2000, and 400 watts instead of 100.
"Pay more and get less" has been the trend
https://www.xda-developers.com/shrinkflation-is-making-nvidi...
That's even before you get into bullshit like fake frames
> "Pay more and get less" has been the trend
That article doesn't support what you're saying whatsoever. GPU cores going down at the same price point is the opposite of shrinkflation, especially when you consider the US dollar is worth ~40% less than it was in 2012. And VRAM prices aren't going down either, especially now.
> bullshit like fake frames
Fake frames are an option. You can still play at native 4K/8K resolution and eat the 2.25-4x cost in power usage and raster compute. It will be miserable, but that's your choice.
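For what it's worth, a quick sketch of where that 2.25-4x range likely comes from, assuming the commonly cited per-axis DLSS render scales (the scale factors are my assumption, not from this thread):

    # Ratio of native output pixels to the upscaler's internal render.
    native_4k = 3840 * 2160
    for mode, scale in [("Quality", 2 / 3), ("Performance", 1 / 2)]:
        internal = (3840 * scale) * (2160 * scale)
        print(mode, round(native_4k / internal, 2))
    # Quality 2.25
    # Performance 4.0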
> Gone are the days of affordable graphics accelerators in the $300 to $500 range. Now it’s $1000 to $2000.
What are you talking about? nVidia only has two models in the $1000 to $2000 range, and they’re clearly premium parts.
The $300 to $500 cards are actually fine for normal gaming unless you demand to play at 4K at high settings.
> The $300 to $500 cards are actually fine for normal gaming unless you demand to play at 4K at high settings.
I don't think that wanting to play games at your screen's native resolution, without having to change settings from their defaults in ways that make the game look much worse, is a very unreasonable "demand".
That used to be possible without spending as much money and it's also not unreasonable for people to point that out
Fair, but there are 2025 games that don't even run well on a 5090. That's the fault of game developers who think they're making the next Crysis, targeting hypothetical future hardware instead of providing a great experience on today's midrange hardware.
Looking at the best-looking games from today vs 10 years ago, they're so similar it's hard to see where all that extra performance is even going.
So far waiting ~5 years to bother with them has been a working strategy for me.
> That used to be possible without spending as much money and it's also not unreasonable for people to point that out
That used to be possible when the most common resolution was 1080p and refresh rates weren't pushing 240Hz+.
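Back-of-the-envelope, that shift is bigger than it sounds (1080p60 and 4K240 are illustrative picks, not figures from this thread):

    # Raw pixel throughput: older mainstream target vs modern high end.
    old = 1920 * 1080 * 60    # 1080p at 60 Hz
    new = 3840 * 2160 * 240   # 4K at 240 Hz
    print(new / old)          # 16.0 -- ~16x the pixels per second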
Pretty much all the lower-priced cards are a bad buy. Nvidia is only competitive on performance at the absolute top end, where they have no competitors. In every other price bracket they lose to AMD and Intel.
You're right.
People want to pretend the fundamentals of economics don't exist AND that the company has moral obligations to consumers. It's laughable.
It's not just nVidia; I've seen other expensive consumer brands draw the same sentiment.
> I've been hearing this from Gamer's Nexus for decades
I liked the idea of Gamer’s Nexus at first, when it was supposed to be data-first, rigorous, independent journalism.
Somewhere along the way it turned into a constant grievance-and-outrage channel. I guess audience capture pays the bills and YouTube drama is hard to ignore. I haven’t bothered with the channel since they tried to go to war with Linus Tech Tips. I don’t even watch LTT, and I certainly don’t want to watch two channels wage YouTube war on each other when I’m just trying to hear how the latest coolers perform or something.
I think a lot of the ultra cynical HN comments about how it’s the end of computing or how gamers have been abandoned are coming from these channels, though.
It's about which market segment gets priority in the company. Doesn't mean they'll stop making gaming cards altogether
Sure, that makes sense. I don't think anyone ever treated Nvidia as a "pure raster" competitor, though. Sacrifices have been made for CUDA for 10+ years; when the Nintendo Switch shipped, it was with automotive-grade SoCs. Gamers have been treated like chopped liver for decades, but they still get GPU releases and software products.
Looking at the flip side: Apple, AMD, and Intel all eschewed compute performance for raster and have nothing to show for it. No "DLSS killer" in sight, no CUDA alternative, nothing. It seems like gaming revenue is a ball and chain holding back more profitable applications.