What's your point?

The AI is being sold as an expert, not a student. These are categorically different things.

The mistake in the post is one that can be avoided by taking a single class at a community college. No PhD required, not even a B.S., not even an electrician's certificate.

So I don't get your point. You're comparing a person in a learning environment to the equivalent of a person claiming to have a PhD in electrical engineering. A student letting the magic smoke escape from a basic circuit is a learning experience (a memorable one, with high impact), especially in a learning environment where an expert can ensure the more dangerous mistakes are unlikely or impossible. But the same action from a PhD-educated engineer would make you reasonably question their qualifications. Yes, humans make mistakes, but if you follow the AI's instructions and light things on fire, you get sued. If you follow the engineer's instructions and set things on fire, that engineer gets fired and likely loses their license.

So what is your point?

No one thinks their breadboard won't catch on fire because an AI agent told them it wouldn't. It's never been easier to learn because of these agents.

Lawyers are getting in trouble because they use AI and submit fabricated citations to fabricated cases as precedent. A bunch of charges were recently thrown out in Wisconsin because of this, and it's not the first time such behavior has made the news.

https://www.wpr.org/news/judge-sanctions-kenosha-county-da-a...

AI is indeed being treated as an expert that replaces human judgment, and people are being hurt because of it.

That's a very strong claim. I don't think people expect their circuits to ignite, LLM instructions or not. But I'd expect that outcome to be less likely when learning from a book or a dedicated website (even accounting for bad manufacturing).

You're biased because you're not considering that, by definition, the student is inexperienced. Unknown unknowns. Tons of people don't know very basic things (why would they?), like circuits with capacitors being dangerous even when the power is off.

Why are you defending the LLM? Would you be as nice to a person? I'd expect not, because these threads tend to point out a person's idiocy. I'm not sure why we give greater leeway to the machine. I'm not sure why we forgive it as if it were a student learning, while someone posting similar instructions on a blog gets (rightfully) thrashed. That blog writer is almost never claiming PhD-level expertise.

I agree that LLMs can greatly aid learning. But I also think they can greatly hinder it. I'm not sure why anyone thinks this is any different from when people got access to the internet. We gave people access to all the information in the world, and people "do their own research" and end up making egregious errors because they don't know how to research (naively thinking it means "searching for information"), what questions to ask, or how to interrogate data (and much more). Instead we've ended up with lots of conspiratorial thinking. Now a sycophantic search engine is going to fix that? I'm unconvinced, mostly because we can observe the result.

In my experience people don’t use LLMs to learn but to circumvent learning.