Self-hosting LLMs isn't completely out of the realm of feasibility. The hardware cost may be 2-3x a hardcore gaming rig, but it would be neat to see open source, self-hosted coding helpers. When Linux hit the scene it put UNIX(ish) power in the hands of anyone, no license fee required. Surely somewhere someone is doing the same with LLM-assisted coding.
The only reasons to run a local model right now are privacy and hobbyist tinkering.
The economics are awful, and local model quality is lackluster by comparison. Never mind that local inference is much slower and the usable context windows are narrower.
$6,000 is 2.5 years of a $200/mo subscription. And in 2.5 years that $6k setup will likely be equivalent to a $1k setup of the time.
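The break-even math above is easy to sanity-check; here's a quick sketch using the figures from the comment (it deliberately ignores electricity, depreciation, and resale value):

```python
# Back-of-the-envelope break-even: how long until cumulative subscription
# spend matches the one-time hardware cost. Figures come from the comment
# above; power draw, depreciation, and upgrades are ignored.
hardware_cost = 6_000   # one-time local build, USD
subscription = 200      # monthly subscription, USD

break_even_months = hardware_cost / subscription
print(break_even_months)        # 30.0 months
print(break_even_months / 12)   # 2.5 years
```

And that 2.5-year horizon is exactly the window over which the hardware itself loses most of its edge.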
We don't even need to compare it to the most expensive subscriptions.
The $20 subscription is far more capable than anything I could build locally for under $10k.