Already on OpenRouter. The Pro version is $1.74/m input, $3.48/m output, while Flash is $0.14/m input, $0.28/m output.
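For anyone eyeballing the price gap, here's a quick sketch of what one request costs at those quoted per-million-token rates (the 10k-in / 1k-out request size is just an illustrative number):

```python
# Quoted rates in USD per million tokens (from the comment above).
RATES = {
    "deepseek-v4-pro":   {"input": 1.74, "output": 3.48},
    "deepseek-v4-flash": {"input": 0.14, "output": 0.28},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the per-million-token rates above."""
    r = RATES[model]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000

# e.g. a 10k-token prompt with a 1k-token reply:
print(f"pro:   ${request_cost('deepseek-v4-pro', 10_000, 1_000):.4f}")    # pro:   $0.0209
print(f"flash: ${request_cost('deepseek-v4-flash', 10_000, 1_000):.4f}")  # flash: $0.0017
```

So Flash is roughly 12x cheaper per request at the same token counts.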

The Pro model is giving 429 Overload errors

Yup, can't really be used in production atm.
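Until capacity improves, about all you can do client-side is back off and retry on those 429s. A minimal sketch; `OverloadError` here is a hypothetical stand-in for whatever exception your HTTP client raises on a 429 response:

```python
import random
import time

class OverloadError(Exception):
    """Hypothetical stand-in for an HTTP 429 / overload error from your client."""

def with_backoff(call, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry `call` on overload errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except OverloadError:
            if attempt == max_retries - 1:
                raise  # out of retries; let the caller handle it
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.5)
            sleep(delay)
```

Wrap your API call in a lambda and pass it to `with_backoff`; the jitter keeps a burst of clients from all retrying at the same instant.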

Getting 'API Error' here :( Every other model is working fine.

Try interacting with it through the website; it will give an error and some explanation of the issue. I had to relax my guardrail settings.

https://openrouter.ai/deepseek/deepseek-v4-pro

https://openrouter.ai/deepseek/deepseek-v4-flash

It's on OR, but currently not available on their Anthropic endpoint. OR, if you read this, pls enable it there! I'm using kimi-2.6 with Claude Code, which works well, but DeepSeek V4 gives an error:

Hitting `https://openrouter.ai/api/messages` with `model=deepseek/deepseek-v4-pro`, OR returns an error because their Anthropic-compat translator doesn't cover V4 yet. The Claude CLI dutifully surfaces that error as "model...does not exist".
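If you're writing your own client (Claude Code itself won't let you do this easily), one workaround is to route the uncovered models to OpenRouter's OpenAI-compatible endpoint instead. A sketch, assuming the endpoint paths above; the unsupported-model set is a hand-maintained guess, not something OR exposes programmatically as far as I know:

```python
# Models the Anthropic-compat translator reportedly doesn't cover yet
# (hand-maintained guess based on the errors in this thread).
ANTHROPIC_COMPAT_UNSUPPORTED = {
    "deepseek/deepseek-v4-pro",
    "deepseek/deepseek-v4-flash",
}

def pick_endpoint(model: str) -> str:
    """Return the OpenRouter endpoint to use for `model`."""
    if model in ANTHROPIC_COMPAT_UNSUPPORTED:
        # OpenAI-compatible endpoint, which does serve these models.
        return "https://openrouter.ai/api/v1/chat/completions"
    # Anthropic-compatible endpoint (the one Claude Code talks to).
    return "https://openrouter.ai/api/messages"
```

Note the two endpoints expect different request/response shapes (Anthropic Messages vs OpenAI chat completions), so you'd need to translate the payload too, not just swap the URL.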