Yeah, we just added support for local models. As I mentioned in an earlier comment, if you run a local model behind an OpenAI-compatible v1/chat/completions endpoint (most local model servers offer one), you can point Erdos at it in the Erdos AI settings.
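
For anyone wondering what "OpenAI-compatible" means in practice, here's a minimal sketch for sanity-checking that your local endpoint speaks the protocol before pointing Erdos at it. The base URL and model name are assumptions for an Ollama-style setup, not anything specific to Erdos; substitute whatever your own server exposes.

```python
# Quick check that a local server speaks the OpenAI
# v1/chat/completions protocol. Base URL and model name are
# placeholders -- swap in whatever your server (Ollama,
# llama.cpp, vLLM, LM Studio, etc.) actually exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # e.g. Ollama's OpenAI-compatible endpoint
    api_key="not-needed-locally",          # most local servers ignore the key
)

resp = client.chat.completions.create(
    model="llama3",  # whichever model your server has loaded
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```

If that prints a reply, the same base URL should be what you enter in the settings.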