does this work with purely local models through Ollama, or do you still need the Ollama server running on another machine? been looking for something that actually works offline for basic voice commands

Still needs a server, but it doesn't have to be on another machine. If your device can handle a model, you can run the Ollama server locally and point aide at the localhost URL, which keeps everything offline.
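A rough sketch of the local setup. The Ollama commands and default port (11434) are standard; how aide takes the server URL is an assumption, so check its docs for the actual setting:

```shell
# Start the Ollama server on this machine
# (listens on http://localhost:11434 by default)
ollama serve &

# Pull a model small enough for your hardware, e.g. a quantized ~1-3B model
ollama pull llama3.2:1b

# Sanity check: list models the local server can see
curl http://localhost:11434/api/tags

# Then configure aide to use the local server URL instead of a remote one.
# The exact config key/flag below is hypothetical -- consult aide's docs.
# aide --ollama-url http://localhost:11434
```

Once the model is pulled, inference runs entirely on-device; no network connection is needed after that.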