Apple has a great chance of making the personal AI machine.

If they managed to add an inference chip that competes with Groq and similar hardware, it would be a much bigger win than building their own models.

Imagine being able to run a 32B reasoning model at hundreds of tokens per second on your own laptop.
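
A quick way to sanity-check that number: single-stream decoding is mostly memory-bandwidth-bound, so tokens per second is roughly memory bandwidth divided by the bytes of weights read per token. Here's a back-of-envelope sketch in Python; the bandwidth figures are illustrative assumptions, not spec-sheet values:

```python
# Back-of-envelope decode throughput for a bandwidth-bound model:
# tokens/sec ~= memory bandwidth / bytes touched per token
# (per generated token, roughly all quantized weights are read once).
# Bandwidth numbers below are illustrative assumptions, not spec sheets.

PARAMS = 32e9          # 32B-parameter model
BYTES_PER_PARAM = 0.5  # 4-bit quantization
weight_bytes = PARAMS * BYTES_PER_PARAM  # ~16 GB of weights

for name, bandwidth_gbs in [
    ("laptop-class unified memory", 400),
    ("hypothetical Groq-like inference chip", 8000),
]:
    tps = bandwidth_gbs * 1e9 / weight_bytes
    print(f"{name}: ~{tps:.0f} tokens/sec")
```

On that rough math, laptop-class bandwidth gets you tens of tokens per second on a 4-bit 32B model; hundreds per second is exactly the gap a dedicated inference chip would close.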

I'm sure that's the future. But we're still pretty far from fitting a 32B model in memory on a typical machine, right? At fp16, 32B parameters come to roughly 64GB of weights alone. Still, the whole industry is on that trajectory. Give it another decade and the standard for base memory will likely be 64GB or more.
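
The memory side is simple arithmetic: parameter count times bytes per parameter, ignoring KV cache and activations (so real requirements run a bit higher). A minimal sketch:

```python
# Approximate weight memory for a 32B model at common precisions.
# Ignores KV cache and activation overhead, so real needs are higher.
PARAMS = 32e9

for precision, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    gb = PARAMS * bytes_per_param / 1e9
    print(f"{precision}: ~{gb:.0f} GB of weights")
```

At 4-bit a 32B model already squeezes into about 16GB of weights, but the unquantized fp16 figure is the one that lines up with a 64GB base-memory future.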