Even though the LiteLLM SDK isn't mentioned on the website, you can "talk" to GoModel with it, because both projects use an OpenAI-compatible API under the hood.

You can use it like this:

  from litellm import completion

  # The "openai/" prefix tells LiteLLM to use its OpenAI-compatible
  # client, pointed at GoModel via api_base.
  response = completion(
      model="openai/gpt-4.1-nano",
      api_base="http://localhost:8080/v1",
      api_key="your-gomodel-key",
      messages=[{"role": "user", "content": "hi"}],
  )
  print(response.choices[0].message.content)
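Under the hood, LiteLLM just POSTs an OpenAI-style JSON body to the server's /v1/chat/completions endpoint, which is why any OpenAI-compatible backend works. A minimal sketch of that request body (field names follow the OpenAI chat-completions schema; the model name here is a placeholder):

  ```python
  import json

  # Same payload shape LiteLLM would send to
  # http://localhost:8080/v1/chat/completions on GoModel's behalf.
  body = {
      "model": "gpt-4.1-nano",
      "messages": [{"role": "user", "content": "hi"}],
  }
  print(json.dumps(body))
  ```

Any client that can produce this body and send it with an Authorization header will work the same way.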

Thank you