This guy might be the last CEO of Intel.

I don't really understand this.

Intel was (and arguably still is) too large relative to its current technical capabilities. Yet even in this current “bad chips” era, Intel is only, at worst, about 10% behind in gaming performance (largely due to cache disparity) and is on par or better in most other workloads. From the K10 era until Zen 3, AMD processors were objectively worse (sometimes comically so) and AMD still managed to survive.

Intel’s mobile CPUs remain extremely competitive. Their integrated GPUs are the fastest in the x86 space. And their SoC and platform features (video decode/encode, NPUs, power management, Wi-Fi, and so on) are best in class for x86 CPUs; they are usually a solid second place or better regardless of architecture.

Subjectively, the most interesting “mainstream” laptops on the market are still, and historically have been, Intel-based. I understand that in an era where the M4 Max, Snapdragon 8 Elite, and Strix Halo each serve as best-in-class in different segments, “mainstream appeal” no longer equates to market dominance. And that is bad news for an Intel that historically just makes a few CPUs (the rest being market-segmented-down versions of those chips), but still, to suggest they will disappear overnight seems... odd.

> Yet even in this current “bad chips” era, Intel is only, at worst, about 10% behind in gaming performance (largely due to cache disparity)

Gaming is irrelevant.

For AMD, gaming (console and PC combined) generates less revenue than embedded-- things like those routers you can get off AliExpress and Synology NASes.

Enterprise, cloud, and AI are the only things that matter, and even enterprise is falling off.

Back in 2020 with the second wave of AMD EPYC Rome, after I had gotten a couple of R7525s in hand and put them through their paces I started saying that you are professionally negligent if you, as a technology professional, recommend an Intel solution unless you have some very specific use cases (AVX512/Optane-optimized options). In 2022 everyone started agreeing with me.

Now you are professionally negligent if you recommend Intel at all.

Enterprise cares about speed, cloud cares about clients per socket, and AI cares about bandwidth. Intel is not competitive in any of those.

Even in the consumer space, for running bullshit workloads like Copilot on a laptop the difference is negligible. Intel is ahead by about 10%-- at ONE HUNDRED AND SIXTY WATTS (if the OEM even allows it)-- while on AMD you trade away that 10% for a 75W envelope.

No human being on earth cares that the scan to identify if there's a cute dog in the photo they just saved to disk takes .255 or .277 seconds. They do care about battery life.

And gaming isn't just irrelevant due to revenue, once you look at margins you start realizing that AMD could never again spend a single cent on marketing X3D chips to gamers and instead redirect that money to target other sectors and they would probably be better off for it.

Look at Nvidia. Gaming went from their cash crop to burdensome baggage in just a couple of years-- from something like 80% of revenue to less than 9%. They don't care about people buying an RTX card-- with all the OEMs, distributors, retailers, marketing, RMAs, and driver patches, at whatever piss-poor margin remains after everyone takes their cut-- when enterprise clients are putting in POs directly to them for tens of thousands of data center cards at a time, at high margins, with barely any marketing spend required.

The very last thing, after figuring out absolutely everything else, that Intel should care about is what their chips are benchmarking at in the latest video game.

I’ve found that a lot of my friends who are into PC gaming still haven’t grokked that ever since the crypto boom, let alone the AI boom, they are the old toys that no one wants to play with anymore.

I had a friend who legitimately could not understand why Nvidia didn’t care about their reputation souring in the gaming market, even after I showed him the numbers on how much Nvidia is selling to corporations now.

I don’t know if it was an inability to deal with the numbers or just culture shock at going from being a valued client to, as you said, “baggage”, but a surprising number of people were in that camp.

Too true, but at the same time... gamers didn't disappear. The market is still there.

Sooner or later, someone else will fill the need. That may be AMD, it could be Intel if they just focus for more than a year, or it'll be some cheap Chinese GPU from a company you've never heard of. (Likely named by mashing the keyboard while capslock is on.)

It's like how the mainframe market is bigger than it has ever been, despite being an irrelevant rounding error in the minds of the "Wintel" server providers, cloud vendors, etc...

Well, this all did happen in the blink of an eye. In 2022, gaming was one of the only things booming after a global shutdown. In 2023, investors all at once jumped ship to chase AI. That can be rather shocking even for tech, since people don't tend to upgrade their GPUs every year.

As a parallel, imagine hearing that the iPhone 13 was the biggest-selling device in history. Then suddenly the iPhone 14 is $4000 and mostly sold to enterprise. It doesn't make any logical sense without following the money. Even then it may not make much sense.

I mean, for a long time the situation was reversed.

Huge gaming demand and easy retail availability of Nvidia's cards were providing economies of scale. A few professors buying the GeForce 8800 to look at this new 'CUDA' thing was mostly a marketing story.

Around the same time there were also one or two Playstation 3 clusters - but a year or two later Sony removed support for that. HPC being inconsequential, and a distraction from their core business, presumably.

It's only in recent years the stuff that used to be marketing decoration has become reality.

Gaming is still a multi-billion-dollar industry, and touches all the other aspects you mentioned as well. Losing out to AMD for the Xbox/PlayStation contracts was definitely a costly loss.

>Look at Nvidia. Gaming went from their cash crop to burdensome baggage in just a couple of years.

Yes, marking up your consumer hardware by 4-5x to appeal to crypto miners surely does have an effect on your market. Arguably, AI saved them from a crash due to their overinvestment in crypto/NFTs. It's not like gaming demand diminished this decade.

Gaming isn't THE way out. But it's one avenue to consider. It does seem like companies c. 2025 prefer to fall into the AI bubble, though.

I wonder what risks the bubble has for them. If they can sell every $30K AI accelerator they make right now, that might cause them to overextend, committing to up-front capacity or long term projects that are financed by the current spend patterns, or just neglecting other parts of their product line.

If the hype dies and they're back to selling 5090s to gamers, can they afford to pay those bills?

That's probably the saddest part. They can still pay the bills-- even pay off the debts from the bubble bursting-- just doing what they used to rely on.

But we know that won't be enough for shareholders, and their stock would tank regardless. Because 2020s speculation isn't about having a reasonable long-term portfolio. It's just extremely abundant pumping until you need to dump and pump the next trend. It's not enough these days to be a good, sensible business.

Historically the answer has been "no". When a company pivots to something that becomes 90% of their revenue, there is no way to go back to doing whatever the 10% was. Imagine Nokia going back to manufacturing gumboots, which is how that company started out!

I kinda see what you mean.

Steam [1] tells me gamers choose Intel over AMD by 59:41.

Price-performance scatter plots [2] say that although Intel isn't battling AMD for the >$1000 Threadripper territory, they have some competitive products in the sub-$500 price band.

And while Intel missed out on the smartphone market, I've heard people comparing their N100 CPUs favourably to the latest Raspberry Pi hardware.

Sure, Intel has had major troubles with their next process node. And one of the best performing laptops is ARM-based. But Intel are nowhere near defeated.

[1] https://store.steampowered.com/hwsurvey/ [2] https://www.cpubenchmark.net/cpu_value_available.html#xy_sca...

The sad thing is that, from what I can tell, Intel doesn't have a true planned successor to Alder Lake-N.

It really might be as bad a mistake as skipping Intel Israel's further development of the Pentium 3 would have been (in other words, no Pentium M, no Core 2 Duo, no Nehalem...).

> Intel was (and arguably still is) too large relative to its current technical capabilities. Yet even in this current “bad chips” era, Intel is only, at worst, about 10% behind in gaming performance (largely due to cache disparity) and is on par or better in most other workloads. From the K10 era until Zen 3, AMD processors were objectively worse (sometimes comically so) and AMD still managed to survive.

The current “bad chips that are only 10% behind” are fabbed by TSMC, not Intel.

"last CEO" is hyperbole. But despite the competitiveness of some of their latest offerings, their trajectory is beyond concerning.

"in the x86 space", otherwise you would have to acknowledge that M4 far outpaces any intel iGPU.

I did acknowledge it? I said best in the x86 space and second overall. The "raw" iGPU ordering is M4, Lunar Lake, Strix Point, and finally 8 Elite. Of course, numbers aren't everything. If one actually were to pick an iGPU for gaming they would be best served by Strix Point.

I think the M4 is a fanless marvel of a chip and noticeably more interesting than the M4 Max. A fanless 6+2+10-configuration M5 with 128GB of RAM would be the most interesting thing in the mobile space.

But since we are splitting hairs, how good is an iGPU if you can't play most games? x86 -> windows or proton. One can't even run Linux, let alone proton, on an M4 (Asahi support stops at the M2).

If we're comparing incompatible platforms, then the Apple M4 Max's iGPU is weaker than the Playstation 5 Pro's AMD iGPU in everything except for memory capacity.

Intel has a competitive iGPU in the low-power mobile space. Their iGPUs in general are also pretty solid for general desktop use. But even in the x86 space, AMD has better-performing iGPU options than anything Intel has ever offered.

In the past, AMD needed to survive for antitrust reasons. Now x86 is losing relevance as alternatives are established. Nobody needs to keep Intel alive.

AMD also received many Hail Marys as a result of Intel’s anticompetitive behavior. Directly via payouts Intel and partners had to make, and indirectly via companies being more willing to work with them for their GPU expertise and better (out of desperation) licensing/purchase agreements.

Intel can’t rely on the same. They haven’t been directly impacted by another larger company, they rely too much on a single technology that’s slowly fading from the spotlight, and they can’t compete against AMD on price.

Maybe if they ended up in a small and lean desperation position they could pivot and survive, but their current business model is a losing eventuality.

AMD could not afford their own foundries anymore. The same is likely to happen to Intel. The CPU business may be sold off to some other company, so x86 and Intel will "survive" for sure, but they will rely on other fabs to produce and will milk the legacy cow instead of holding the overall performance crown.

Did you completely ignore the last paragraph?

As I said, AMD survived by going into a lean pivot out of desperation. Intel has that opportunity as well, but the deck is stacked against them due to their size and over-reliance on specific IPs.

Which alternatives? Other than Apple, where can I get a non-x86 desktop?

1. Desktop market share is shrinking and shrinking.

2. https://system76.com/desktops/thelio-astra-a1.1-n1/configure

3. Nvidia N1x is not yet for sale but benchmarks are promising.

1) Shrinking compared to what? The moment you want to do any serious work or gaming, you need a desktop (or a laptop, but a real PC in any case).

2) Ok, so there is expensive workstation available. It is a step forward I guess.

3) Call me when it is available and I can buy it in any normal computer shop.

Look, I hate the x86 architecture with a passion, having grown up with MS-DOS and the horrors of real mode. But the truth is that if I want to buy a computer right now, today, it is either an x86 PC or an Apple, and I have zero interest in Apple's closed ecosystem, so a PC it is.

Does the Nvidia DGX Spark qualify as a desktop?

Technically yes, but I don't see the average person getting one. Much like the Raptor Talos, it is a very niche product.

Intel has 75% of the desktop/server chip market. They'll survive.

Intel has only about half of the server market at this point, and that's with their products priced so low they're nearly selling them at cost.

The margins on their desktop products are also way down, their current desktop product isn't popular due to performance regressions in a number of areas relative to the previous generation (and not being competitive with AMD in general), and their previous generation products continue to suffer reliability problems.

And all this, while they're lighting billions of dollars on fire investing in building a foundry that has yet to attract a single significant customer.

Intel's not in a good spot right now.