You can use this with Open WebUI already. Just run llm install llm-model-gateway, then after you save a consortium, run llm serve --host 0.0.0.0. That gives you an OpenAI-compatible endpoint which you add to your chat client.
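
Roughly, the sequence is (a sketch; it assumes you already have the llm CLI installed and a consortium saved as a model, and the exact flags may vary by plugin version):

  llm install llm-model-gateway
  llm serve --host 0.0.0.0

Then add the address the server is listening on as an OpenAI-compatible connection in Open WebUI (or any other chat client that supports custom OpenAI endpoints) and pick the saved consortium from its model list.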