LLMs are guessing machines; they don’t “decide” anything. Any such “decision” is really made by the people programming the system and putting alignment guardrails in place.
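To make the point concrete, here is a minimal toy sketch (not any real model or API; `model_next_token`, `BLOCKLIST`, and `guarded_reply` are made-up names): the “model” only samples from a probability distribution over next tokens, while any refusal comes from a separate, human-written policy layer.

```python
import random

def model_next_token(prompt: str) -> str:
    # Stand-in for an LLM: it just samples from a toy distribution
    # over possible next tokens -- pure guessing, no deliberation.
    tokens = ["sure", "maybe", "here's how", "I refuse"]
    weights = [0.4, 0.3, 0.2, 0.1]  # made-up probabilities for illustration
    return random.choices(tokens, weights=weights, k=1)[0]

# The guardrail is a policy chosen by people, bolted on outside the model.
BLOCKLIST = {"here's how"}

def guarded_reply(prompt: str) -> str:
    token = model_next_token(prompt)   # the model only guesses
    if token in BLOCKLIST:             # the developers' rule does the "deciding"
        return "[blocked by developer policy]"
    return token

if __name__ == "__main__":
    print(guarded_reply("do something risky"))
```

In this sketch the model never chooses to refuse anything; the refusal behavior lives entirely in the blocklist the developers wrote.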