Do you have the option to run on a local model? Lots of firms don't want data or prompts going outside the local network
Yep, if you have a local model served behind an OpenAI-compatible v1/chat/completions endpoint (most local model servers, such as Ollama, llama.cpp, and vLLM, support this), you can point Erdos at it in the Erdos AI settings, so no data or prompts leave your network.
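For anyone wanting a quick sanity check before wiring it up, here's a minimal sketch that exercises a local OpenAI-compatible endpoint directly. It assumes a local server listening at http://localhost:11434/v1 (Ollama's default) with a model named "llama3" loaded; swap in whatever host, port, and model your setup actually uses:

```python
# Sketch: verify a local OpenAI-compatible endpoint responds before
# pointing Erdos at it. The base_url and model name below are
# assumptions -- adjust them to match your local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint; traffic stays on your machine
    api_key="unused",  # many local servers ignore the key, but the client requires one
)

response = client.chat.completions.create(
    model="llama3",  # whichever model your local server has loaded
    messages=[{"role": "user", "content": "Reply with 'ok' if you can hear me."}],
)
print(response.choices[0].message.content)
```

If that prints a reply, the same base URL should work in the Erdos AI settings.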