Do you have a source for this local stuff?

i can kinda see it. They spent a lot of time getting Gemma 4 pretty efficient, then saw everyone buying Macs to run it, and realized local inference is maybe a real moat since Apple doesn't make any AI of its own

It would be an interesting product if it could actually give you GPT-level performance locally, but it will be an awful experience if it's essentially just cloud AI... a premium laptop where most of the features are locked behind a subscription would be wild

They are already rolling it out in Chrome: https://www.pcmag.com/news/chrome-is-quietly-downloading-4gb... It won't work on a Chromebook with 4GB of RAM, so they need beefier hardware.

Yep, that's the answer. That said, I still imagine their preference is that nothing runs locally and everything goes through their servers, so they can collect all that precious usage data.

They could provide a preinstalled harness that sends them all the juicy anonymized (or not) data