Sure; I just mean relative to the degree of plausibility LLMs typically achieve in technical explanations. They're often wrong there too, but the difference in plausibility in these scenarios is something I found interesting.