Isn't the LM Studio API OpenAI-compatible? Codex CLI already supports third-party models: you edit the config file (it's TOML, ~/.codex/config.toml), and you can add many model providers.
I never managed to get it to work. I even used ChatGPT to try to do it for me :)
This is how I did it - https://github.com/openai/codex/blob/main/docs/config.md#mod...
I have it point to OpenRouter, and then you just export the API key and run codex. Works smoothly.
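For reference, here's a minimal sketch of what that config looks like, assuming the model_providers schema documented in the linked config.md; the model id is just a placeholder, not a recommendation:

```toml
# ~/.codex/config.toml — sketch based on the model_providers section
# of openai/codex's config.md; swap in a real OpenRouter model id.
model = "some-provider/some-model"      # placeholder model id
model_provider = "openrouter"

[model_providers.openrouter]
name = "OpenRouter"
base_url = "https://openrouter.ai/api/v1"  # OpenAI-compatible endpoint
env_key = "OPENROUTER_API_KEY"             # codex reads the key from this env var
```

Then export OPENROUTER_API_KEY and run codex as usual. The same pattern should work for a local LM Studio server by pointing base_url at its local OpenAI-compatible endpoint instead.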