You are right about the motivation behind the glee, but it actually has a kernel of truth in it: if it's making such elementary mistakes, this thing isn't going to be autonomous anytime soon.

Such elementary mistakes can be made by humans under the influence of a substance or with some mental issues. These are pretty much the kind of people you wouldn't trust with a vehicle or anything important.

IMHO all entry-level clerical jobs and coding as a profession are done, but these elementary mistakes imply that people with jobs that require agency will be fine. Any non-entry-level job has a huge component of trust in it.

I think 'elementary mistakes' in humans are far more common than just among the mentally ill or intoxicated. There are entire shows/YT channels dedicated to grabbing a random person on the street and asking them a series of simple questions.

Often, these questions are pure fact (who is the current US Vice President), but for some, the idea is that a young child can answer the questions better than an 'average' adult. These questions often play on the assumptions an adult might make that lead them astray, whereas a child/pre-teen answers correctly by holding different assumptions or not assuming at all.

Presumably, even some of the worst (poorest-performing) contestants on these shows (i.e. the ones selected to provide humor for audiences) have jobs that require agency. I think it's more likely that most jobs/tasks either have extensive rules (and/or refer to rules defined elsewhere, like in the legal system) or they have allowances for human error and ambiguity.

The LLM is probably also not going to launch into a rant about how it incorporates religious and racial beliefs into its life when asked about current heads of state. You ask the LLM about a solar configuration, and I think it must be exceptionally rare for it to instead tell you about its feelings on politics.

We had a big winter storm a few weeks ago, right when I received a large solar panel to review. I sent my grandpa a picture of the solar panel on its ground mount, covered in snow, noting I had just gotten it that day and it wasn't working well (he's very MAGA-y, so I figured the joke would land well). I received a straight-faced reply on how PV panels work, noting that they require direct sunlight and that direct sunlight through heavy snow doesn't count; they don't tell you this when they sell these things, he says. I decided to chalk this up to being out-deadpanned and did not reply "thanks, ChatGPT."

I'm pretty sure 100% of those people would get the correct answers if they were focused, had access to the internet, and had studied the entire corpus of human knowledge.

In the case of the issue at hand, though, it is not a knowledge question; it is a logic question. No human will go to the carwash without the car unless they are intoxicated or have some issue preventing them from thinking clearly.

IMHO all that can be solved when AI actually starts acting in place of humans, though. At this time, "AI" is just an LLM that outputs something based on a single input, but a human mind operates in a very different environment than that.