We won't be in a supply crunch forever. We'll have a demand crunch. Demand for powerful consumer hardware will shrink so much that producing it will lose its economies of scale. It was always bound to happen, just delayed by the trend of pursuing realistic graphics for games.
People who are willing to drop $20k on a computer might not be affected much tho.
> People who are willing to drop $20k on a computer might not be affected much tho.
They probably won't, but those willing to drop $3-10k will be if consumer and data-center computing diverge at the architectural level. It's the classic hollowing out of the middle - most of the offerings end up in a race to the bottom chasing the volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high end becomes increasingly bespoke/pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).
My bet is that phone hardware will be used more and more in mini PCs and laptops, keeping the cost down and the volume up. We already see it with Apple and with many of the Chinese mini PC makers I've looked at.
This is so true. Convergence will continue. H/W miniaturization will keep increasing. In fact, new brands could easily appear and even overtake the largest players. For example, have you seen this massive range of docking technology?
https://us.ugreen.com/collections/usb-c-hubs - these docks only require a single USB port to connect to. That could be an SBC working as a handheld. These docks could end up being the largest cost component in the new era of all-in-ones. UGreen could be the next Apple as screens and processors snap on to these hubs, in addition to their own range of power banks and SSD enclosures. Their quality is high too.
In fact, I would go so far as to say we are entering a tinkering culture, and free-energy technologies are upon us as a response to oppressive economic times. Sort of like how the largest leaps in religious and esoteric thought have occurred in the most oppressive of circumstances.
People will reject their crappy thin clients, start tinkering and build their own networks. Knowledge and currency will stay private and concentrated - at least at first.
RAM is going to be the most expensive component, I suppose.
But indeed, once you have USB-C support on your device, you can connect all kinds of peripherals through it, from keyboards to 4K screens. Standardized device classes obviate the need for most drivers.
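For instance, here's a minimal sketch with pyusb (my choice of library, not something anyone above mentioned; it needs a libusb backend installed) that prints the class code each attached device advertises. Class-compliant devices (HID, mass storage, hubs, UVC webcams) are what let the OS ship one generic driver instead of one per vendor:

    # Rough sketch, all assumptions mine: list the USB device class each
    # attached device reports. 0x00 means the class is declared
    # per-interface rather than per-device.
    import usb.core  # pip install pyusb

    CLASS_NAMES = {
        0x00: "defined per interface",
        0x03: "HID (keyboards, mice)",
        0x08: "mass storage",
        0x09: "hub",
        0x0E: "video (UVC webcams)",
    }

    for dev in usb.core.find(find_all=True):
        cls = CLASS_NAMES.get(dev.bDeviceClass, hex(dev.bDeviceClass))
        print(f"{dev.idVendor:04x}:{dev.idProduct:04x} -> {cls}")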
Yep. I was thinking that as crypto miners pivot into AI https://catenaa.com/markets/cryptocurrencies/jpmorgan-morgan... - there must also be a case for miners (anyone, really) liquidating their hardware, including memory. So the price of memory has its own limits to growth - latent availability - but that's another topic.
If this ends up being true, desktop Linux adoption might make inroads. Windows apps run like crap on ARM and no one is bothering to make ARM builds of their software.
Because ARM Windows is locked down tightly. That same lockdown will interfere with Linux adoption on similar hardware.
The original Raspberry Pi was built around an overstock phone chip. Modern alternatives built around Rockchip and similar high-end phone chips venture into the territory of lower-end laptops. Aliexpress is full of entry-level laptops based on ARM phone chips (apparently running Android).
This will likely extend further and further, more into the "normie" territory. MS Windows is, of course, the thing that keeps many people pinned to the x64 realm, but, as Chromebooks and the Steam Deck show us, Windows is not always a hard requirement to reach a large enough market segment.
No, a set-top-box chip.
All we need is for HDMI to be unlocked so it works on phones, or maybe VGA adapters that work on phones. And a way to "sideload" our own apps. Hackers please make this happen.
Some modern phones do DisplayPort over USB C.
Unified hardware helps some and hurts some. See: same gpus for gaming and for AI.
Apple just launched an amazing $600 laptop, and the top models have massive performance. What are we talking about here?
I don't think personal computers will go away, but I think the era of "put it together yourself" commodity PC parts is likely coming to an end. I think we're going to see manufacturers back out of that space as demand decreases. Part selection will become sparser. That will drive further contraction as the market dries up. Buying boxed motherboards, CPUs, video cards, etc. will still exist, but prices will never recover to "golden age" levels.
The large PC builders (Dell, HP, Lenovo) will continue down the road of cost reduction and proprietary parts. For the vast majority of people pre-packaged machines from the "big 3" are good enough. (Obviously, Apple will continue to Apple, too.)
I think bespoke commodity PCs will go the route, pricing-wise, of the Raptor Talos machines.
Edit: For a lot of people the fully customized bespoke PC experience is preferred. I used to be that person.
I also get why that doesn't seem like a big deal. I've been a "Dell laptop as a daily driver" user for >20 years now. My two home servers are just Dell server machines, too. I got tired of screwing around with hardware and the specs Dell provided were close enough to what I wanted.
There are upsides here as well! I think of things like the NUC or Mac Mini - ATX is from 1995, and I'm hopeful computers will become nicer things as we trend away from the bucket-o-parts model.
I'm very excited about the Steam Machine for the reasons you mention - I want to buy a system, not a loose collection of parts that kind-of-sort-of implement some standard to the point that they probably work together.
What are the upsides? You just listed a few things that you subjectively like, but not why they should take over all parts of the PC market. The only factor I can think of is size, but those small all-in-one computers are already widely available now without the need to hollow out the custom PC market.
There's nothing wrong with ATX or having interchangeable components. An established standard means that small companies can start manufacturing components more easily and provide more competition. If you turn PCs into prepackaged proprietary monoliths, expect even fewer players on the market than we have now, in addition to a complete lack of repairability and upgradability. When you can't pick and choose the parts, you let the manufacturer dictate what you're allowed to buy in what bundles, what spare parts they may sell to you (if any) and what prices you will pay for any of these things. Even if you're not building custom PCs yourself, the availability of all these individual components is putting an intrinsic check on what all-in-one manufacturers can reasonably charge you.
But I don't want an amazing $600 laptop, I want a powerful desktop x86 machine with loads of RAM and disk space. As cheap as it was a couple of years ago.
> As cheap as it was a couple of years ago.
I also want housing as cheap as it was a couple of years ago.
You can have both. You just have to undo the forced bail-in of Millennial and Gen-Z/Alpha/Beta productivity to cover the debts and lifestyles of Silent Gen/Boomer/Gen-X asset holders. The insanity of contemporary markets doesn't reflect anything natural about the world's economic priorities, but instead the privileging of the priorities of that cohort. They've cornered control until enough people call bullshit. So, call bullshit.
x86 going away wouldn't be surprising. Ignoring David Patterson was a mistake to begin with.
Looking at AMD's x64 server offerings, I don't see why that would go away.
But I can imagine that it would become less prevalent on personal machines, maybe even rare eventually.
Not sure about the memory, but Xeon Scalable/Max ES/QS chips and their boards are still not horribly expensive.
Prior to the crunch, you could have anything from 48-64 cores and a good chunk of RAM (128GB+). If you were inordinately lucky, 56 cores and 64GB of onboard HBM2e was doable for 900-1500 USD.
They’re not Threadrippers or EPYCs, but sort of in between - a server chip that can also make a stout workstation.
8GB isn't an "amazing" laptop, it's a budget laptop. It's also thermally constrained quite a bit, so not even as "amazing" as it could be.
The point about Apple is that everyone from Zoom, Slack, etc. will be forced to optimize for that 8GB. (Same as getting rid of the awful Flash player.)
Many people need only a basic device for Netflix, YouTube, Google Docs, email, or searching for/buying flight tickets. That will be amazing.
Many have a job-supplied laptop/desktop with great performance (made rubbish by AV scanners, but that's a different issue).
>(Same as getting rid of the awful Flash player.)
I was looking up an old video game homepage the other day for some visual design guidance. It was archived on the Wayback Machine, but with Flash gone, so was the site. Ruffle can't account for every edge case.
Flash was good. It was the bedrock of a massive chunk of the Old Net. The only awful thing was the people who pushed and cheered for its demise just so that Apple could justify their walled garden for the few years before webdev caught up. Burning the British Museum to run a steam engine.
Reading some of the doomer comments in this thread feels like taking a glimpse into a different world.
We're out here with amazing performance in $600 laptops that last all day on battery and half of this comment section is acting like personal computing is over.
They don't run the software I want to run (Linux, Windows games) and/or with the performance I want.
Raspberry Pi is way cheaper than those things, and I'm sure you could hook one up with an all-day battery for $100-200. Doesn't mean it's "better".
They trade blows performance-wise with the M1 MacBook Pro sitting on my desk. And there's nothing stopping Asahi Linux running on them except for driver support. They look like fantastic machines.
They’re not ideal for all use cases, of course. I’m happy to still have my big Linux workstation under my desk. But they seem to me like personal computers in all the ways that matter.
Two different populations — those interested in computing, and those interested in computers.
Personal computing and IBM PC clones are not the same thing. The fall of PC clones can happen while other personal computing devices continue to be produced. The $600 laptop is not a PC.
Apple laptops are PCs (Personal Computers). They are not IBM PCs. But IBM hasn't made PCs in years, and there hasn't been any IBM PC hardware to clone in years.
If they choke the consumer PC market long enough, the segment will die.
> We'll have a demand crunch
This is what I'm afraid of. As more stuff moves to the cloud, helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.
I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.
Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.
>Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.
It's also a nightmare from any sort of privacy perspective, in a world that's already becoming too much like a panopticon.
> I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform.
Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.
I love it when I get my Robloxhead daughter to test drive some of the games I play on my 5090 box. "Ooooh these graphics are unreal" "Can we stop for just a moment and admire this grass" :-D
I think we're talking about 2 different things. I'm not sure where Roblox fits into what I said.
The problem I describe is companies pushing towards the "rent" model vs. "buy to own". Nvidia was just an example by virtue of their size. Microsoft could be another, they're also eying the game streaming market. Once enough buyers become renters, the buying market shrinks and becomes untenable for the rest, pushing more people to rent.
GPUs are so expensive now that many gamers are eyeing GeForce Now as a viable long-term solution for gaming. Just recently there was a discussion on HN about GeForce Now where a lot of comments were "I can pay for 10 years of GeForce Now with the price of a 5090, and that's before counting electricity". All upsides, right?
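The arithmetic behind that sentiment is easy to check - here's the back-of-envelope, with both inputs being ballpark assumptions on my part (a ~$2,400 street-priced 5090, a ~$20/month top streaming tier), not quoted prices:

    # Back-of-envelope only; both inputs are rough assumptions, not quotes.
    gpu_price = 2400.0             # assumed street price of an RTX 5090, USD
    subscription_per_month = 20.0  # assumed top GeForce Now tier, USD/month

    months = gpu_price / subscription_per_month
    print(f"{months:.0f} months = {months / 12:.1f} years of streaming")
    # -> 120 months = 10.0 years, before counting electricity or the rest of the PC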
In parallel, Nvidia probably sees more money in the datacenter market, so it would rather focus the available production capacity there. Once enough gamers move away from local compute, the demand is unlikely to come back, so future generations of GPUs would get more and more expensive to cater to an ever-shrinking market. This is the vicious cycle: expensive GPUs + cheap cloud gaming -> shrinking GPU market and higher GPU prices -> more of step 1.
Roblox is one example of a game; there are many popular games that aren't graphics-intensive or don't rely on eye candy. But what about all the other games that require a beefy GPU to run? Gamers will want to play them, and Nvidia, like most other companies, sees more value in recurring revenue than in one-time sales. A GPU you own won't bring Nvidia money later; a subscription keeps doing that.
The price hikes come only after there's no real alternative to renting. Look at the video streaming industry.
Yeah, this gamer conspiracy theory never made sense to me.
Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current gen graphics, then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.
But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.
> then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.
Riddle me this: does anyone pursue a self-pwn intentionally?
"Conspiracy theory" is just dehumanizer talk for falling prey to business as usual.
As someone who has been buying computers for 40+ years, including the 1st gen 3dfx card, etc, this is where I NOPE out of the next upgrade cycle. I am not renting hardware. It's bad enough ISPs are renting modems.
The problem is that there is a very large incentive for three large companies to corner the market on computing components, forcing consumers to rent access instead of owning.
> We won't be in a supply crunch forever.
This is what always happens in capitalism. Scarcity is almost always followed by a glut.
I don’t believe we are seeing the investments that would indicate this will happen.
Memory makers, for example, have sold out their inventory for several years, but instead of investing to manufacture more, they’re shutting down their consumer divisions. They’re just transferring their consumer supply to their B2B (read: AI) supply instead.
That’s likely because they don’t expect this demand to last past a few years.
They have seen boom and bust cycles previously and are understandably wary of expanding capacity for expected demand that may fizzle. If they stay too conservative, China’s CXMT is chomping at the bit to eat their lunch, backed by the Chinese government, but that’s not going to help until late 2027 at best.
How much capital would you invest in a capacity expansion for a trend that may or may not yet be durable? Now, how much would you invest when two major state-backed Chinese entities that essentially aren't allowed to go bankrupt and have infinity money are competing with you?
If the demand lasts for a few years, I’m doubtful that all of the consumer capacity will come back.
Consumer demand likely depends on how local models end up working out. Nothing else really needs serious local computing power anymore. My guess is that even high-end games will probably stagnate for a while.
Many users will not want to risk their privacy, data, and workflow on someone else's rapidly-enshittifying AI cloud model. Right now we don't have much choice, but there are signs of progress.
High-end games are far from stagnating when viewed in terms of usable performance.
Many new games cannot run at max settings, 4K, 120Hz on any modern GPU. We probably need to hit 8K before we max out on the returns higher resolution can provide. Not to mention most game devs are targeting an install base of $500, 6-year-old consumer hardware, in a world where the 5090 exists.
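Just the raw pixel throughput makes the point - a quick calculation assuming nothing beyond the standard resolutions:

    # Pixels per second a GPU must shade at each target, ignoring everything else.
    resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
    hz = 120
    for name, (w, h) in resolutions.items():
        print(f"{name} @ {hz}Hz: {w * h * hz / 1e9:.2f} billion pixels/sec")
    # 8K/120 needs 4x the pixel throughput of 4K/120 and 16x that of 1080p/120.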
That's what I mean by stagnating... most players already can't run with max settings, or even close to them. From the developers' point of view there's not much point raising the bar any higher right now, while the best GPU hardware is so far out of reach of your average PC gamer.