I have several different models of Raspberry Pi and I'm having a hard time justifying buying a 5... but if I can run LLMs off one or two, I just might. I guess what the next Raspberry Pi needs is a genuinely impressive GPU that COULD run small AI models, so people will start cracking at it.