Yeah, Kimi K2 apparently requires ~2TB of VRAM to run[1], and it trails the proprietary models in intelligence. There's no world where people are going to replace ChatGPT or Claude Code with a local model.
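That 2TB figure lines up with a back-of-envelope weight-memory estimate, assuming Kimi K2 has roughly 1T total parameters held in 16-bit precision (the parameter count is my assumption, not something the linked page is quoted on here):

```python
# Rough VRAM footprint of model weights alone (ignores KV cache,
# activations, and framework overhead, which only push it higher).
def weight_vram_tb(params_billions: float, bytes_per_param: float) -> float:
    """Estimated weight memory in terabytes."""
    return params_billions * 1e9 * bytes_per_param / 1e12

# Assumed ~1T total parameters for Kimi K2 (it's a large MoE model).
print(weight_vram_tb(1000, 2))  # bf16/fp16
print(weight_vram_tb(1000, 1))  # fp8 quantized
```

Even an aggressive fp8 quant still lands around a terabyte of weights, which is why this stays firmly in datacenter territory rather than anything you'd run at home.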

[1] https://www.canirun.ai/model/kimi-k2