>LLMs can be wonderful teachers

Are they or aren't they?

Mostly, no. They will explain things to you and you'll feel like you understand them. When you have to do it, though, you'll find you're not any better off than when you started.

I used to see this with students in calculus who abused the tutoring resources. They'd have tutors just work problems (often their homework...) in front of them. "Ah! Obviously that trig substitution integral worked that way. Oh, of course, that proof is very obvious in retrospect." And then they'd walk away from the exam with a 30% and no idea why their 20 hours of "study" didn't produce the same performance as peers who worked problems, read the material, asked questions, and so on.

Most AI use is the same in my experience. "Show me how the fundamental theorem of calculus works." The LLM puts together a very elaborate and flashy presentation that they skim. Great. That's no different than reading a textbook. Even if you ask the LLM questions and have it elaborate on things, you've never once done one of the most important things a student can do: spend time confused, working hard at understanding something that's not obvious. The LLM will make it obvious at every point. Total lack of friction. Works about as well as a spotter who does the lifting for you.

As usual it depends. When it does well it's because it can do well. When it does poorly it's because you're prompting it wrong.

>When it does well it's because it can do well.

Can't argue with that logic

hammers are both a great tool and a deadly weapon at once

Not at once, surely

limp response brah, both possibilities remain plausible until one crystallizes at the moment of observation

A million times better than any human teacher I’ve ever had, for sure.

Now I’m certain that there exist those mythical human instructors who can do better, but that’s not worth much if 99.99% of people don’t have access to them. Just like a good human physician who takes their time with the patient is better than an LLM, but that’s not worth much either given that this doesn’t match most people’s experience with their own physicians.

Did an LLM teach you a topic you did not feel like learning?

For me the best human teachers were the ones who managed to make me interested in topics that I thought were boring/useless (my opinion often being stupid, mostly due to lack of experience).

So far with LLMs I learn about things I already know something about (at least that they exist) and am interested in, which is a small subset of the things one should learn over a lifetime.

Well, I have some evidence to support your hypothesis. During Covid my kids were at home, eventually with some kind of self-learning website from school. I was upstairs working, checking their progress on the parents' app. Finish your daily school work and then you can game.

The kids learnt all about Team Fortress 2, Roblox, Rainbow Six etc. They also learnt how to game the learning system so it looked like they were doing their work.

Post college, are you hiring random teachers to make you excited about random topics or something?

[deleted]

Good point well made.

>A million times better than any human teacher I’ve ever had, for sure.

Not really, not if you want to ask it deep questions. It won't have an answer that is deeper than something that you can find online, and if pressed it will just keep circling around the same response.

The reason is that this "thing" was never curious, never asked questions, and never really learned anything. It has just learned the Internet "by heart," and is as boring as a human teacher who is not really curious about the subject they are teaching and has just got some degree by learning a textbook "by heart." Of course it does it much better than a human, but it is fundamentally the same thing.

>Now I’m certain that there exist those mythical human instructors who can do better,

You're certain that mythical instructors exist (?) who "can" do better?

Are human instructors more competent as teachers than AI teachers, or are AI teachers more competent as teachers than human teachers? No "this or that can happen," just a definitive statement please.

AI is likely a million times better a student than my dimwit cybersec meatbags...er, majors, for sure, as well! I don't have a reliable way to measure or experience why/how, tho, so I'm not out here claiming it. Even if I did, why would I argue for their replacement?

They can be incredible. One on one teaching with an infinitely patient teacher who can generate interactive problems on the fly, for dollars a month? Wild. A year of paid ChatGPT would pay for about 9 hours of cheap tutoring here.

That's not going to work out the way you think it will when a student won't even know how to ask questions.