LLMs don't "think" or "understand" in any way. They aren't AGI. They're still just stochastic parrots.
Letting them make decisions without humans in the loop is still pretty crazy.