I'm still betting on a future AppleTV model being a full-on local LLM machine.
This way they could offload as much of the "LLM" work as possible onto a device that lives in the home; all family-linked phones and devices could use it for local inference.
It's way overpowered as is anyway, so why not use it for something useful? (Rough sketch of the idea below.)
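Something like this, assuming the Apple TV exposed an OpenAI-compatible chat endpoint on the LAN (the hostname, port, and model name here are hypothetical, not a real Apple API):

```swift
import Foundation

// Sketch: a family iPhone sending a chat request to a local LLM server
// running on an Apple TV. Endpoint shape follows the common
// OpenAI-style /v1/chat/completions convention; everything local-specific
// (appletv.local, port 8080, "local-8b") is an assumption.
struct ChatMessage: Codable { let role: String; let content: String }
struct ChatRequest: Codable { let model: String; let messages: [ChatMessage] }

func askHomeLLM(_ prompt: String) async throws -> String {
    var request = URLRequest(url: URL(string: "http://appletv.local:8080/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "local-8b", messages: [ChatMessage(role: "user", content: prompt)])
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    // Pull the first choice's message content out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```

The point being that the phones never need the weights or the thermals, just a request over the home network.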
Is it? The base $600 Mac and $150 Apple TV are easily two of the best deals in their respective markets.