"Are you an LLM?" poof, fails the Turing test.

Even if they lie, you could ask them 20 times and they'd repeat the lie every time, without ever getting annoyed: FAIL.

LLMs cannot pass the Turing test; it's easy to see they're not human. They always enjoy questions! And they never ask any!

You're trained to spot LLM-like output. My 70-year-old mother is not. She thought cabbage tractor was real until I broke the news to her. It's not her fault either.

The Turing test wasn't meant to be bulletproof, or even quantifiable. It was a thought experiment.