Maybe I got this wrong, but I thought ambiguity referred to the input. So in a deterministic system I would assume that an input of "Give an example of a dice roll" will always produce the exact same example (unless the model also gets the context of the message history).
Ambiguity is what happens when you change the prompt slightly, e.g. by adding a word: "Give an example of a single dice roll". As humans we would expect this to be the same question, and thus (in a deterministic system) expect the same answer. But to an LLM it may not be.
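To make the distinction concrete, here is a toy sketch (not an actual LLM): a hash function stands in for the model's deterministic mapping from prompt to output. Identical prompts always yield identical outputs, yet a semantically equivalent rewording can still map to a different output, which is ambiguity without any non-determinism.

```python
import hashlib

def toy_model(prompt: str) -> int:
    """A fully deterministic 'model': same prompt, same die face every time."""
    digest = hashlib.sha256(prompt.encode()).hexdigest()
    return int(digest, 16) % 6 + 1  # map the hash to a die face 1..6

a = toy_model("Give an example of a dice roll")
b = toy_model("Give an example of a dice roll")
c = toy_model("Give an example of a single dice roll")

print(a == b)  # identical prompt -> identical output, always True
# a == c is NOT guaranteed: the reworded prompt may land on a different
# face, even though nothing in this function is random.
```

The point of the sketch is that determinism is a property of the mapping, while ambiguity lives in the input side: two prompts a human reads as the same question are still different inputs to the function.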
> Ambiguity: [...] Different or non-deterministic models will return highly variance results.
Yes, and thanks. That was my intended point, but you point out a better example: slightly different prompts may also produce highly varied responses.
(My subsequent comments on ambiguous models were in case I was misinterpreting the comment I was replying to. I also generally think of ambiguity as a property of the input. Either way, ambiguity is not the same as non-determinism.)