Running local LLMs on laptops still feels like early days, but it’s great to see how quickly the tooling is improving and how many people are sharing real setups.