How do you configure LLM temperature in coding agents, e.g. opencode?

https://opencode.ai/docs/agents/#temperature

Set it in your opencode.json.
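Something like this (a sketch going off the linked docs page; the "build" agent name and the exact schema are that page's, so double-check against it):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "agent": {
    "build": {
      "temperature": 0.2
    }
  }
}
```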

Note: when I said "you have to hack it in", I meant you'll need to hack in support for modern LLM samplers like min_p, which lets you push temperature arbitrarily high (as min_p approaches 1) while maintaining coherence.
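To make that concrete, here's a toy min-p sampler (numpy, illustrative only, not any particular library's implementation). The key detail is the ordering: filter on the untempered distribution first, then temperature-scale only the survivors, which as I understand it matches llama.cpp's default sampler chain. With min_p near 1, only tokens essentially tied with the top token survive, so even infinite temperature just picks uniformly among good tokens:

```python
import numpy as np

def min_p_sample(logits, temperature=1.0, min_p=0.1, rng=None):
    """Toy min-p sampler: filter on the untempered distribution,
    then apply temperature only to the surviving tokens."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Keep tokens whose probability is at least min_p * p_max.
    survivors = np.flatnonzero(probs >= min_p * probs.max())
    # As temperature -> inf this becomes a uniform pick among the
    # survivors, which are all "reasonable" tokens by construction.
    scaled = logits[survivors] / temperature
    p = np.exp(scaled - scaled.max())
    p /= p.sum()
    return survivors[rng.choice(len(survivors), p=p)]

logits = np.array([5.0, 4.95, 2.0, 0.5, -1.0])
# min_p=0.9 keeps only the two near-tied top tokens; even at
# temperature=1e6 the sample is a coin flip between them.
print(min_p_sample(logits, temperature=1e6, min_p=0.9))
```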

You can't without hacking it! That's my point! The only places where you can easily set it are via the API directly, or "coomer" frontends like SillyTavern, Oobabooga, etc.
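"Via the API directly" meaning something like this against a local llama.cpp server, whose /completion endpoint exposes min_p and temperature (host, port, and prompt here are made up):

```python
import requests

resp = requests.post(
    "http://localhost:8080/completion",  # local llama.cpp server
    json={
        "prompt": "Write a quicksort in Python:\n",
        "temperature": 3.0,  # far hotter than any coding agent allows
        "min_p": 0.3,        # min_p keeps the hot samples coherent
        "n_predict": 256,
    },
)
print(resp.json()["content"])
```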

Same problem with image generation (no support for swapping in different SDE solvers, the image-generation equivalent of LLM sampling), but that world has its own "coomer" tools, e.g. ComfyUI or Automatic1111.
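And as with LLM sampling, the underlying library makes the swap trivial; it's the polished frontends that hide it. A diffusers sketch (model id and prompt are placeholders):

```python
import torch
from diffusers import StableDiffusionPipeline, DPMSolverSDEScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # any SD checkpoint works
    torch_dtype=torch.float16,
).to("cuda")
# Replace the default solver with an SDE variant (needs torchsde
# installed) -- the image-side equivalent of changing the sampler.
pipe.scheduler = DPMSolverSDEScheduler.from_config(pipe.scheduler.config)
image = pipe("a lighthouse at dusk", num_inference_steps=30).images[0]
image.save("out.png")
```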

Once again, porn is where the innovation is…

Please... "Creative Writing"