I'm not deeply familiar with Mastra, but reading their docs, it looks like they use the Vercel AI SDK — which is great, since Vercel's AI SDK can work with any OpenAI-compatible API, including ours. All you need to do is set a custom API base URL; in our case, that's https://api.synthetic.new/v1
Then just plug in your Synthetic API key, and you should be able to use any supported model. For example, to use GLM-4.5, you'd pass the following model string: "hf:zai-org/GLM-4.5"
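Here's a minimal sketch of what that looks like with the Vercel AI SDK directly — assuming you have the `ai` and `@ai-sdk/openai` packages installed and your key in a `SYNTHETIC_API_KEY` environment variable (adjust names to however Mastra wires up its providers):

```ts
// Point the Vercel AI SDK at Synthetic's OpenAI-compatible endpoint.
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const synthetic = createOpenAI({
  baseURL: "https://api.synthetic.new/v1", // custom OpenAI-compatible base URL
  apiKey: process.env.SYNTHETIC_API_KEY,   // your Synthetic API key
});

const { text } = await generateText({
  model: synthetic("hf:zai-org/GLM-4.5"),  // any supported model string works here
  prompt: "Hello from Synthetic!",
});

console.log(text);
```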
The AI SDK docs on custom base URLs and provider management are here: https://ai-sdk.dev/docs/ai-sdk-core/provider-management
You can also join our Discord if you need help! https://synthetic.new/discord should redirect you to our Discord server :)