It depends on your use case, though. You don't always need the best. Even if you have a hypercar, you probably drive a regular car to work.
There's also a personal "good enough" point for everyone hoping to cut the cord and go local. If local models get as good as today's Claude Sonnet, I'd be totally fine using that locally and riding the local improvements from then on.
And for things like home automation or general conversational tasks, local has been good enough for a while now. I don't need the hypercar of LLMs to help me cook a recipe, for example.