There are a few hiccups, but everyone's acting like Apple Silicon wasn't one of the most wildly successful overnight improvements in laptops in the last 50 years and that AirPods aren't so popular that even Android phone users buy them.

I agree. As usual these days, the culprit for the lack of innovation is AI (mostly LLMs). Apple did try to shove it at users, unsuccessfully, and had the right amount of sanity to backtrack once they realized how misaligned users' expectations are with what LLMs have to offer, which is mostly a lack of reliability; the average user assumes that something that walks like a duck and quacks like a duck must be a duck, but LLMs, while pretending to be human-like, are not like that. Microsoft, on the other hand, is still forcing Copilot on its users, and users are resisting. We will see which tactic wins in the long run, but I would bet on Apple.

If history is any indication, Microsoft will slowly improve until things become sort of decent, then Apple will release a neutered but polished version that is lauded as "Apple Invents AI" and immediately dominate the market.

But their version will lack one critical feature that everyone expects, such as a power button, or copy/paste, or pinch to zoom. For AI, maybe something like "remember context".

Then 3 years later they will invent "remember context," and everyone will lose their shit again.

The article addresses Apple's historical late mover advantage strategy, and suggests that Tim Cook doesn't have the same vision that Jobs had to pull those moves off.

Cook isn’t a visionary; he is an ops & logistics guy. He has a major problem with failing at AI and lying last year about “Apple Intelligence”.

He needs some fresh blood in software as well.

He should just buy something like Claude and lock them in with stock and very high salaries.

He should also stop scrimping on integrated memory on the M series. Overpriced memory is a huge issue. And do something about the GPU as well.

Perhaps an AI N series with 128 GB of RAM and a dramatically more powerful GPU with higher core/thread counts.

Buying competency and hoping for the best sounds exactly like what an economist would do.

They will probably leave as soon as their manager becomes just a run-of-the-mill babysitter manager with charisma.

So true. That was Meta's all-too-generic strategy for the Metaverse: throw tons of money and people at the problem. And it worked as poorly as one would expect.

Now Meta is doing it again for AI.

Apple won't throw money down a hole like that, but they definitely need a more original and opinionated strategy than just an increased talent count.

The lack of love they gave Siri and AI in general, since they bought the original Siri tech in 2010, just keeps looking like a worse and worse oversight.

Ah yes, Jobs famously could solve the LLMs being shit problem overnight.

By convincing users they want hallucinating, gaslighting imps in their pockets 24/7 to reinforce their thoughts.

And he would have been all over it given his own alt-medicine beliefs.

He is dead, we cannot be certain.

>...backtrack once they realized how the users' expectations are misaligned with what LLMs...

Not my experience as an Apple user. The issue was not that I expected something better than an LLM. It was that I'm used to ChatGPT and the Apple AI was rubbish in comparison.

And it did useless stuff like mangled summaries of messages. If I could set it to know what files I have and what web pages I'd visited, and could ask "which file do I have the holiday records in?" etc., then that would be useful.

AirPods were many years ago, as was the MacBook Pro redesign.

CEOs are judged on today. Past achievements are past achievements, and he was paid handsomely for them.

Tim Cook is a money guy. Saving money by reducing per-device licensing costs through creating their own chips, and now their own modems, is a money guy's domain.

Hardware and software innovation is another skill.

Tim Cook wasn't appointed on his ability to innovate. But he needs to ensure he has individuals who can do so, and that there aren't layers of management or himself obstructing them.

[deleted]

Apple Silicon was definitely a massive leap forwards but I wouldn’t say the competition has stayed still either.

And the innovative part was basically going system-on-chip; even Microsoft had tried to move to the ARM architecture before. It was a natural consequence of designing for mobile devices, paired with a massive marketing campaign selling it as something brand new.

Yeah, everyone complains about how Apple don’t “innovate”, but currently they are just focussing on making the best consumer hardware of all time. Don’t get me wrong, I dislike them for other reasons, but those reasons are mainly things Jobs pushed for decades ago. An Apple fan should love the current Apple.

They also had a record-breaking quarter for their services, while streaming services everywhere are bleeding out.

Yeah I guess it’s been a while since their last iPhone moment but that entire decade was just one revolution after another for every segment of tech, and Apple came out on top

With their in-house chips they’re going to do fine for AI, even if they need to partner with someone else on it

> focussing on making the best consumer hardware of all time

Yes, but what use is the best consumer hardware when the software is degrading?

Considering the Liquid Glass thing, there's no one left at Apple who understands and cares about usability, for example.

Plus the iOSification of macOS for no good reason... maybe to make it less attractive to developers?

This. A good device is a combination of hardware and software. If one of those is lacking, the whole product is lacking.

I won't deny that Apple's hardware is great, but the software has been severely crippled by commercial rather than technical reasons. People have to go out of their way to glimpse outside the walled garden.

Facts. I switched to Mac primarily because of the CPU. Look at the state of Intel...

I will never buy Intel; I will be good with AMD. How and when, I do not know, as I do not have the financial means to get a new rig. My PC is considered ancient at this point, and unfortunately, due to my medical conditions, I highly doubt it will happen anytime soon, if ever, unless someone helps me out with enough money, oh welp.

Both of these products could be expected from any well run company. They are both very polished versions of something that already existed. The direction to optimize in was already clear. No one wants to compare Apple to a generic well run company, they want to compare Apple to itself a decade or two ago.

Others have said that if Jobs were still alive, AR would be ubiquitous by now, and everyone would have a stylish pair of Apple glasses. I think that is exactly right.

Instead there is an incredibly expensive VR scuba mask, with relatively little adoption. It's certainly not changing the way we use physical spaces and transforming society, which is something a previous Apple could have pulled off. Users and developers need to be shown how to get value from something radically new, and Apple hasn't done that recently.

I think the Apple Silicon transition is far from something that could be expected from 'any' company. Microsoft already tried something similar with ARM Surface machines and the whole attempt was an absolute failure.

> I think that is exactly right.

I think that's entirely wrong. The hardware just isn't there yet! The AVP is the closest you can get to real "AR glasses" at the moment (as distinct from the Xreal 'non-context-aware screen overlay with a tiny FoV and fixed position'), but it turns out the hardware needed for that is >1 lb of stuff.

Developers working on Apple platforms are used to their software being broken by Apple every once in a while, so they need to update it to match the latest OS’s expectations. In the Windows world, 30-year-old Win32 apps can still run on Windows 11, as long as they don’t use any egregious hacks. And if they stop working on an Arm PC, Microsoft will be to blame.

It’s not even new. Apple transitioned from PowerPC chips to Intel and basically did the same thing again. It’s a technical achievement, to be sure. Apple users are unfortunately used to ditching software, because backward compatibility isn’t something Apple strives for. Old PowerPC or 32-bit iOS/Mac OS X software, for example.

It is a failure because, contrary to Apple, people on Microsoft platforms value backwards compatibility; that is why it holds 70% of the global desktop market.

Yeah this is a bonkers take.

AR is vaporware. The form factor sucks, the power requirements to do anything meaningful are too high and there just isn't a compelling use case for most of the population. I'd argue that Vision Pro makes the most sense out of anything, being more work rather than consumer oriented.

AR is a folly. Meta almost tanked themselves by going all in, and all they managed was a tech demo.

As for the other point, well, the iPod and iPhone were both just "polished versions of what already existed". That's kind of what Apple does... very, very well. AirPods are bigger than Spotify. Apple Silicon was about optimizing hardware, software, and supply chain; it's really a thing of beauty how they managed to pull it off and completely disentangle themselves from Intel and Nvidia.

Even Jobs wasn’t perfect. NeXT machines were technically amazing and also beautiful but they never really found many customers. He also thought the Segway would transform society but it ended up being sort of a joke and best known for being used by fat mall cops.

NeXT software though became the base of OS X.

BTW Mac Pros did not find many customers either. I bet Silicon Graphics did not sell very many boxes; it was important who bought these boxes.

Yes but the software wasn’t selling until it was forced upon the Apple customer base. You could buy NeXTSTEP for an Intel PC in the 1990s but nobody did.

The right people bought NeXT though. Carmack and his team developed Doom on NeXT computers [1], and the result has profoundly changed the mass-market PC scene.

[1]: https://en.wikipedia.org/wiki/Doom_(1993_video_game)#Engine

In addition, NeXT had a successful pivot to selling a web server framework named WebObjects, which had many big-name customers such as Dell (which infamously abandoned WebObjects once Apple purchased NeXT due to the optics of having an Apple competitor’s web store backed by an Apple product).

It’s conceivable that had Apple not purchased NeXT, even though NeXT probably would’ve ended up getting purchased by another company, its technology would’ve likely lived on. Perhaps a 1998 or 1999 NeXT could have open-sourced the OpenStep API and WebObjects as a Hail Mary move…talk about a completely different “what if” path for the Linux desktop and server!

You're missing OpenSTEP from that picture.

Which NeXT and Sun collaborated on, and thanks to that collaboration, plus WebObjects (the Java port), Java and Java EE came to be.

Patrick Naughton on what actually influenced Java's semantics and object model,

https://cs.gmu.edu/~sean/stuff/java-objc.html

Distributed Objects Everywhere genesis, and its evolution into Enterprise JavaBeans.

https://en.wikipedia.org/wiki/Distributed_Objects_Everywhere

While in parallel, NeXT refactored WebObjects into Java as well.

> Both of these products could be expected from any well run company.

I was not expecting Apple silicon from Apple, I didn’t even realize they’d been hiring chip designers for the past decade before that.

Personal computing really sucked before the mid 2000s. You had Windows computers, which were cheap and fast but had lots of bugs, were hard to use, and didn’t do low energy very well; you had Linux as a powerful novelty; and you had expensive, well-designed Macs that were sort of slow but had better battery life. It took Apple’s Intel transition to fix all of that, and I basically thought it was finally done and personal computing all of a sudden got very boring. But today I’m running mid-sized LLM inference on an M3 Max, and it’s all very exciting again.

What is AR's "1,000 songs in your pocket" moment, though?

There is no thing that needs the ubiquity that Jobs would have channelled the idea into.

In my opinion it would just never have seen the light of day.

> Others have said that if Jobs were still alive, AR would be ubiquitous by now, and everyone would have a stylish pair of Apple glasses. I think that is exactly right.

I don't think this is true. If anything, we'd have iPads with no stylus. Steve Jobs was a visionary, but most importantly a good businessman. It's not like everything that made the iPhone was in-house and strictly an Apple invention. Apple was lucky in that it bought a company that had developed the multi-touch technology the phone so heavily relied on. Without that, the iPhone wasn't going to happen when it did.

Additionally, a lot was sacrificed to make the iPhone happen. People took some serious physical and mental tolls in order to help ship that product. Marriages crumbled. The pressure was ruthless.

AR has a lot of limitations; that's what they were trying to sidestep. The idea is to take a frame of the environment, composite the virtual frame and the real one in unison with various blending ops at your disposal, and finally present the user with a properly composited image that works anywhere.
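To make that compositing step concrete, here's a toy sketch of the video-passthrough approach: a per-pixel alpha-over blend of a rendered virtual frame onto a captured camera frame. (Python/NumPy; the function name and array shapes are just illustrative, not anything Apple ships.)

    import numpy as np

    # camera_frame: HxWx3 uint8 passthrough image from the cameras
    # virtual_rgba: HxWx4 uint8 rendered frame with per-pixel alpha
    def composite_over(camera_frame, virtual_rgba):
        rgb = virtual_rgba[..., :3].astype(np.float32)
        alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
        # Classic "over" operator: virtual pixels win where alpha is high,
        # the camera shows through where alpha is low.
        blended = rgb * alpha + camera_frame.astype(np.float32) * (1.0 - alpha)
        return blended.astype(np.uint8)

With passthrough, any blending op is just pixel math; the optical approaches have to approximate this with physical masking layers, which is where the parallax problems below come from.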

Now, if you use multiple lenses, with one masking the real scene in grayscale positioned just in front of the lens onto which you project the virtual frame, you can do some limited blending ops. But it's quite difficult to deal with parallax if the lenses aren't glued together, and even then there's refraction.

Apple took a very conservative approach. And made it to market. Now look at the competition. Sure, they have concepts with good compositing, but the ones in the market right now aren't able to produce the same imagery that Apple Vision is capable of.

Maybe I'm wrong, haven't looked at the market in some months, but I always thought of the Apple Vision as a very pragmatic design trying to circumvent AR limitations by being a VR headset.

Most generic well run companies would never take the risk & expense entailed in a platform transition like Intel => Apple Silicon.

You might argue that generic companies are doing that now as we see Windows ARM laptops finally, but I think they only got the courage to do that because Apple went first.

I mean, there were Windows-on-ARM laptops previously. The first Snapdragon 835/8cx/7c Windows laptops predate the M1 lineup. They certainly weren't POPULAR or FAST, but they existed.

https://www.cnet.com/reviews/asus-novago-review/

Isn't the VR scuba mask dead now?

I think Apple is hitting a wall that most tech companies are hitting. New gadgets aren't really improving our lives.

We have the internet on our phones; now the internet is a wasteland.

Probably the biggest cultural innovation of the last 5 years has been podcasts and that's just gussied up radio.

Podcasts were arguably mainstream back in 2014 with Serial.

They definitely were, I remember everyone and their grandmother listening to Serial.

And Apple was involved in that as well, even if it was not planned. The "pod" in podcast comes from iPod, after all.

Apple Silicon was only an improvement in battery life. AMD and Intel still made more powerful processors, coupled with Nvidia and AMD GPUs that blew Apple Silicon out of the water. It also depends on what you consider successful, since MacBooks are still a small fraction of the laptop market.

AirPods being dominant is something from many years ago. Sony, Bose, and even Samsung caught up quickly and offer much better integration elsewhere, since half of the AirPods' features are locked to the iPhone. AirPods were early to market and were assisted by Apple forcing users to buy them once their phones no longer had headphone jacks, which boosted their word-of-mouth marketing.

> Apple Silicon was only an improvement in battery life.

For a company selling huge quantities of mobile computing devices, that's paramount.

I was wondering how long laptop computers have existed. The Epson HX-20 (1981) and Grid Compass (1982) seem like good early examples - the latter introducing the clamshell design used in many subsequent laptops such as ThinkPads.

But has Apple Silicon justified the investment? I don't doubt it's good tech, but the market share hasn't necessarily reflected it. In 2024, total shipped units dropped quite a bit. Judging by the available data, I wouldn't use the term "wildly successful" just yet (particularly if we account for pre/post pandemic sales).

I think their problem is that laptops and computers are so well made these days that there is no real need or want to upgrade.

Build quality is definitely a factor, but that probably dates back to the 2008 unibody laptops. An equally big factor is that most specs have plateaued: Retina screens can't get much better, SSDs are fast enough for most data, and processing is fast enough to play HD video. The M1 was a worthwhile upgrade for many because it improved battery life to be more phone-like.

Isn't that the truth. Before I bought my current M1 MBP in 2022, I was on a 2014 13" MBP.

[deleted]

Haven't macs had that reputation for a while?

I had a MacBook Pro with a butterfly keyboard before; it was terrible and would break constantly, and my AppleCare was offered without accidental damage coverage and only lasted 3 years.

Now I have a MacBook Pro, I pay a small annual fee, and I have full-coverage AppleCare with accidental damage, and if the battery goes out they replace it, etc. The only reasons I would upgrade at this point would be a built-in cellular modem or Face ID.

Yeah, but the ARM Macs offered such an obvious advantage over their Intel predecessors that (I guess) a lot of Mac owners made the switch around the same time.

But today, even the first 2021 M1 MBPs are still very good devices (I still don't feel like I need to upgrade to a more recent model).

What would make me buy a new Mac is a good OLED display. The display in my new Windows laptop is the one component that's better compared to my 2021 MBP.

PA Semi was acquired for $278 million in 2008. I'm going to stick my neck out there and say the investment was indeed justified.

Apple Silicon is just a marketing name for the computer variants of the ARM chips for iPhones. How much of an extra investment could it really be to make a better version of a chip with fewer cooling and power constraints than the same chips that power phones?

Of course they were worth the marginal investment, even if you just consider that they now only have one architecture to support; they are better computers, and margins are better since they don’t have to pay Intel.

People are acting like Apple Silicon actually matters to Apple's profit margins. Since when was the MacBook a serious contributor to Apple's margins?

As usual, a Silicon Valley point of view; in most of the world, the reality is a bit different.

And the Apple Silicon chips even include specialized cores that allow LLMs to run locally on both iPhones and MacBooks. (See the "Foundation Models Framework".)

Which was a day late and a dollar short, even at release. Those ANEs are only really good at inference, and even then you get faster results using your Apple Silicon GPU. They're slow, incomplete, and not integrated into the GPU architecture (the way Nvidia's are), therefore killing any chance of Apple Silicon seeing serious AI server usage.

If you want to brag about Apple's AI hardware prowess, talk about MLX. The ANE was a pretty obvious mistake compared to Nvidia's approach, and hundreds of businesses had their own accelerators even before Apple made theirs.
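To be fair to MLX, local inference on that unified memory really is painless these days. A minimal sketch, assuming the mlx-lm Python package; the model name is just an illustrative quantized community checkpoint, not a recommendation:

    from mlx_lm import load, generate

    # Weights are loaded straight into unified memory, so the GPU reads them
    # without a separate device copy.
    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

    text = generate(
        model,
        tokenizer,
        prompt="Explain why unified memory helps local LLM inference.",
        max_tokens=128,
    )
    print(text)

None of that touches the ANE, which is sort of the point.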

These chips were designed to be SoC/SiPs for consumer hardware. It feels odd to judge them for their suitability in servers.

The cores and architecture were designed for smartphones; they got put in desktops and rackmount servers anyway. I can judge them as whatever the product is, and it's not an untold mystery why Apple Silicon servers aren't flying off the shelves in the AI boom.

The ANE is not designed to be fast. Nothing on the M-series is designed to be fast. Any time it's fast, that's a lucky accident.

It's designed to have optimal performance/power ratios. The GPU is faster but it uses more power.

[deleted]

It makes a lot of strategic sense, but if you have that much money in the bank, who wouldn't buy a silicon design firm to design your CPUs in-house? I bet everyone here on HN would have suggested doing that if asked in a board meeting. The big surprise was how long it took Apple to bite the bullet.

Anyway, despite this, Apple still has their castle built in someone else's kingdom, namely TSMC.

From a CEO's POV this move makes no sense, as evidenced by the fact that literally no one else does it.

It goes against all MBA orthodoxy.

Nothing wrong with being fabless. Plus Apple could switch to inferior silicon and probably convince its userbase that they are the creme de la creme.

> Nothing wrong with being fabless.

Sure, if you're able to buy all the required production slots, as Apple has been doing in the past.

However, with new market pressure from AI this might not be possible in the future ...

What company has the money to outbid Apple for production slots?

See C1 for an example.

That was 5 years ago and the chips themselves are much older than that. They can't skate on that forever, especially since the competition caught up.

Where has the competition caught up? Aren't Apple's chips still by far best-in-class, from the laptop down to the watch?

It's an honest question. I would love an ARM-based laptop running NixOS that is competitive with the upcoming M5 Pros, but I don't see one anywhere.

Case in point: Google's newest Google TV box (released this year) is an absolute turd compared with an 8-year-old 4K Apple TV box.

The main innovation was buying all 3nm production from TSMC. Snapdragon chips on the same node perform as well.

The biggest stunt Apple pulled was making people think it was Apple's new CPU designs that created the performance gap, and not just buying all the production capacity for next-gen fabs.

We used to get that same performance gain every few years just from regular process improvements, but this time Apple made it seem like it was Apple that made the gain and not the industry at large.

Your argument would be much stronger with one single Amazon link to a non-Apple product that is competitive on price, perf, and thermals.

Oh great, where can I buy a non-Mac with the same performance, battery life, thermals, weight, premium materials, and a trackpad that doesn’t feel like styrofoam coated in oil?

Since you say the competition has caught up, I assume it’ll be easy to send an Amazon link to the evidence.

Not OP, but that's the thing, right? You've talked a lot about feel, which means there will never be an answer that overcomes the emotion built up in your head.

I wonder what it would take to win you over, and I don't mean the rational you, which I'm sure you can justify; I mean the irrational "I'm in love with the product" you.

It’d take these very objective things:

- 7+ hour battery life while running a bunch of Docker containers and IDEs, at the same screen brightness nits I run my MacBook at.

- Those things run not perceptibly slower than on my Apple hardware

- No perceptible fan noise while doing the above

- The chassis does not creak when I pick it up from one corner

- Does not weigh more than double my MacBook

I’ll even relax the trackpad requirement and allow a 60Hz HiDPI screen instead of the 120Hz screen my MacBook from four years ago drives while doing all of the above.

Is there anything that can check those boxes?

This is a much more definable list, no feels there. (I don't know, as I'm not hunting for a similar option, although I probably will be in the future.)

> - The chassis does not creak when I pick it up from one corner

This would be an INSTANT 'nope' from me.

Every Fortune 500 company I’ve worked for (three so far) has sent me a Windows “dev machine” laptop with huge RAM and an i7-whatever processor… and they all creak if you pick them up from a corner. And the screen hinges all make weird sounds.

It’s not, like, a usability concern, but it’s something I’m definitely willing to fork over a couple hundred bucks to not feel every day.

"A lot" being in one of six items, specifically about an element that you touch and quality is generally judged by how it feels

Apple Silicon either followed Microsoft's direction (which pushed for ARM laptops a solid 8 years before the M1) or just did a better iteration than Intel at the same time Intel was struggling.

Regardless of how you want to frame it, there clearly wasn't any grand innovative strategic vision in play there from Apple. It was just an incremental improvement to long-established products, with no shortage of equally significant and impressive incremental improvements from others along the way.

It's a bit like saying that an iPhone was not a big deal, phones were becoming smarter by the day, Apple just did a better iteration in 2007 than Nokia and Blackberry.

Making an iteration so much better is not something I'd ascribe to luck.

The iPhone was a new UI built around capacitive touchscreens, which at the time were themselves an entirely new category.

What, exactly, did the M1 change about using a laptop? Longer battery life? Faster? Okay, the same improvements as had been featured for the last 20 years. And it wasn't any thinner; depending on the model line it was even substantially thicker.

So what exactly was new about M1 that wasn't just a bog standard iteration we'd seen dozens of times by that point?

Even just considering Apple's product line, surely you'd have to rank things like Intel's Core 2 as more significant, as it enabled the creation of the MacBook Air and was Apple's return to subcompact laptops. Or Intel's Thunderbolt, which radically changed the entire I/O story and capabilities for Apple, who fully embraced it.

I went from a 2016 Intel MacBook Pro to an M1, and although the differences on paper looked like many other spec bumps, the actual experience felt like a paradigm shift.

It’s the first time I owned a laptop that lacked compromises. It was consistently snappy and fast at every task from a full to empty battery. And it did so without burning my lap or producing an incredibly annoying fan noise.

Yes, it's an excellent laptop. But being good and being innovative are not the same thing. There's nothing innovative about the M1 lineup. It's better on the same axis that laptops have been improving at consistently.

As for the loud fans and burning laps, though, do also keep in mind that Apple was particularly bad in the 2016 generation, which is why the M1 laptops got thicker.

> What, exactly, did M1 change

For instance, the unified CPU and GPU memory, which allows you to run ML models on a Mac as if you had a large dedicated GPU. (Unified memory of course harks back to the 8-bit era; the key was to make it performant.)
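A small sketch of what that buys you in practice, assuming the mlx Python package (the array sizes are arbitrary): the same buffers can be used by CPU and GPU ops with no explicit transfers.

    import mlx.core as mx

    a = mx.random.normal((4096, 4096))
    b = mx.random.normal((4096, 4096))

    # Run the matmul on the GPU and the reduction on the CPU; both operate on
    # the same unified-memory buffers, so nothing gets copied between devices.
    c = mx.matmul(a, b, stream=mx.gpu)
    total = mx.sum(c, stream=mx.cpu)

    mx.eval(total)  # MLX is lazy; this forces the computation
    print(total)

On a discrete GPU, the equivalent workflow means shuttling data over PCIe before and after the kernel runs.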

> For instance, the unified CPU and GPU memory

Was already common and widespread with iGPUs in CPUs on x86, and was standard on all ARM mobile SoCs for a solid decade.

AMD in particular had already done a bunch here with their APUs, including OpenCL support in 2012 and fully coherent shared address spaces with 2014's Kaveri.

Pretty common in PC integrated GPUs and game consoles.

The change is that Apple no longer depends on Intel and AMD, and Intel in particular was basically incapable of doing anything at the time. So now they can improve on a yearly basis, reliably.

Security is also much improved over the Intel laptops for various obscure and technical reasons.

There's nothing magic about "ARM" nor is it a single thing. ARMv8 is quite different from ARMv7 and was designed for the purpose of making desktop chips like the M1. But you can make something that good for any ISA if you design it well enough.

ARMv8/v9's main advantages over Intel are in security, not performance.

Rosetta 2 was the big innovation there. Not ARM in a mobile device.

Surely you realize the "2" in Rosetta 2 isn't just quirky branding; it's there because it's the second time they did that very thing. How was it a "big innovation"?

Was that even innovative? I mean, this is the third time they've changed chip architecture, they've had some practice here.

The first time I wrote anything for Mac OS, it was with Metrowerks' CodeWarrior whatever-the-student-edition-was-called, where the compiler only targeted the 68k series chips.

20 years ago, as a senior engineer, I was given a MacBook with an Arm processor inside it, and asked to evaluate it for Aperture. I gave it a “meh” review, and that was the last I heard about Mac on Arm for a while.

Apple might not have been publicly pushing Arm, but they very much were not following Microsoft’s direction. It just wasn’t good enough.