We are uniquely capable of doing that because we invented it :) It’s a self-serving definition, a job description.

This isn’t an argument against LLMs’ capability. But the burden of proof is on the LLMs’ side.

True. That capability might be reserved for AGI. The current implementation does feel like a party trick, and I don't enjoy working with it.