If I can use this with a local LLM, it could be useful.

Ollama support is included by default; you just add the endpoint URL yourself.
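For anyone wondering what "add the endpoint URL" looks like in practice, here is a minimal sketch of calling a local Ollama server through its OpenAI-compatible chat endpoint. The URL uses Ollama's default port (11434), and the model name `llama3.2:3b` is just an example of a small model in the 2-4 GB range; substitute whatever you have pulled locally.

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible endpoint on a local install.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3.2:3b") -> urllib.request.Request:
    """Build a chat-completion request for a local Ollama endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the request; requires `ollama serve` running locally."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the text under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

No API key is needed for a stock local Ollama install, which is part of the appeal over a hosted frontier model.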

Yeah. This seems like an area where a "tiny" (2-4 GB) local model would be more than sufficient to generate very high-quality queries and schema answers for the vast majority of questions, to the point that it feels outright wasteful to pay a frontier model for it.