reducing the load on overworked teachers by using GPT to generate exercises, quizzes and explanations for students is "endless worthless sludge"?

I have teachers in my family; their lives have been basically ruined by people using GPT-4 to cheat on their assignments. They spend their weekends trying to work out whether someone has "actually written" something or not.

So sorry, we're back to spam generator. Even if it's "good spam".

One potential fix, or at least a partial mitigation, could be to weight homework 50% and exams 50%, and if a student's exam grades differ from their homework grades by a significant amount (e.g. 2 standard deviations) then the lower grade gets 100% weight. It's a crude instrument, but it might do the job.
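That rule is simple enough to sketch in a few lines of Python. This is purely illustrative of the suggestion above; the function name and the choice of measuring "significant" against the class-wide spread of homework-exam gaps are my own assumptions, not part of the original proposal:

```python
import statistics

def final_grade(homework, exam, class_gaps):
    """Blend homework and exam 50/50, unless the student's homework-exam
    gap is an outlier (here: more than 2 standard deviations of the
    class-wide gaps), in which case the lower grade gets 100% weight.
    The gap-based threshold is an illustrative assumption."""
    gap = abs(homework - exam)
    threshold = 2 * statistics.stdev(class_gaps)
    if gap > threshold:
        return min(homework, exam)
    return 0.5 * homework + 0.5 * exam
```

A student scoring 90 on homework but 60 on the exam in a class where gaps are normally a point or two would land on 60, while a student with consistent grades gets the plain average. It is, as noted, a crude instrument.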

>their lives have been basically ruined

a bit dramatic. there has to be an adjustment of teaching/assessing, but nothing that would "ruin" anyone's life.

>So sorry, we're back to spam generator. Even if it's "good spam".

is it spam if it's useful and solves a problem? I don't agree it fits the definition any more.

Teachers are under immense pressure, GPT allows a teacher to generate extension questions for gifted students or differentiate for less capable students, all on the fly. It can create CBT material tailored to a class or even an individual student. It's an extremely useful tool for capable teachers.

>is it spam if it's useful and solves a problem? I don't agree it fits the definition any more.

Who said generating an essay is useful, sorry? What problem does that solve?

Your comments come across as overly optimistic and dismissive, like you have something to gain personally and aren't interested in listening to others' feedback.

I'm developing tools to help teachers generate learning material, exercises and quizzes tailored to student needs.

>Who said generating an essay is useful sorry ? What problem does that solve?

Creating useful learning materials aligned with curriculum outcomes, taking into account learner needs and current level of understanding, is literally the bread and butter of teaching.

I think those kinds of resources are both useful and solve a very real problem.

>Your comments come accross as overly optimistic and dismissive . Like you have something to gain personally and aren’t interested in listening to others feedback.

Fair point. I do have something to gain here. I've given a number of example prompts in my replies to this thread that are extremely useful for a working teacher. I don't think I'm being overly optimistic, though; I'm not talking vague hypotheticals. The tools I'm building are already showing great usefulness.

> a bit dramatic. there has to be an adjustment of teaching/assessing, but nothing that would "ruin" anyone's life.

If you don't have the power to just change your mind about what the entire curriculum and/or assessment context is, it can be a workload increase of dozens of hours per week or more. If you do have the power, and do want to change your entire curriculum, it's hundreds of hours one-time. "Lives basically ruined" is an exaggeration, but you're preposterously understating the negative impact.

> is it spam if it's useful and solves a problem?

Whether or not it's useful has nothing to do with whether or not it's spam. I'm not claiming that your product is spam -- I'll get back to that -- but your reply to the spam accusation is completely wrong.

As for your hypothesis, I've had interactions where it did a good job of generating alternative activities/exercises, and interactions where it strenuously and lengthily kept suggesting absolute garbage. There's already garbage on the internet, we don't need LLMs to generate more. But yes, I've had situations where I got a good suggestion or two or three, in a list of ten or twenty, and although that's kind of blech, it's still better than not having the good suggestions.

>Whether or not it's useful has nothing to do with whether or not it's spam.

I think it has a lot to do with it. I can't see how generating educational content for the purpose of enhancing student outcomes with content reviewed by expert teachers can fall under the category of spam.

>As for your hypothesis, I've had interactions where it did a good job of generating alternative activities/exercises, and interactions where it strenuously and lengthily kept suggesting absolute garbage.

I'd like to present concrete examples of what I would consider to be useful content for a K-12 teacher.

Here's a very quick example that I whipped up:

https://chatgpt.com/share/ec0927bc-0407-478b-b8e5-47aabb52d2...

This would align with Year 9 Maths for the Australian Curriculum.

This is an extremely valuable tool for

- A graduate teacher struggling to keep up with creating resources for new classes

- An experienced teacher moving to a new subject area or year level

Bear in mind that the GPT output is not necessarily intended to be used verbatim. A qualified specialist teacher, often with six years of study (a 4-year undergrad plus a 2-year Masters), is the expert in the room who presumably will review the output, adjust, elaborate, etc.

As a launching pad for tailored content for a gifted student, or for lower-level, differentiated content for a struggling student, the GPT response is absolutely phenomenal. Unbelievably good.

I've used Maths as an example; however, it's also very good at giving topic overviews across the Australian Curriculum.

Here's one for: elements of poetry: structure and forms

https://chatgpt.com/share/979a33e5-0d2d-4213-af14-408385ed39...

Again, an amazing introduction to the topic (I can't remember the exact curriculum outcome it's aligned to), giving the teacher a structured intro that can then be spun off into exercises, activities or deep dives into the subtopics.

> I've had situations where I got a good suggestion or two or three, in a list of ten or twenty

This is a result of poor prompting. I'm working with very structured, detailed curriculum documents and the output across subject areas is just unbelievably good.

This is all for a K-12 context.

There are countless existing, human-vetted, purpose-designed bodies of work full of material like the stuff your ChatGPT just "created". Why not use those?

Also, each of your examples had at least one error, did you not see them?

>Also, each of your examples had at least one error, did you not see them?

I didn't; could you point them out?

>There are countless existing, human-vetted, designed on special purpose, bodies of work full of material like the stuff your chatgpt just "created". Why not use those?

As a classroom teacher I can tell you that piecing together existing resources is hard work, and sometimes impossible, because resource A is in this textbook (which might not be digital), resource B is on that website, and quiz C is on another site. It's often very difficult to put all these pieces together in a cohesive manner. GPT can do all that and more.

The point is not to replace all existing resources with GPT, this is all or nothing logic. It's another tool in the tool belt which can save time and provide new ways of doing things.

Why haven't they just gone back to basics and forced students to write out long essays on paper, by hand and in class?

Also have teachers in my family. Most of the time is spent adjusting the syllabus schedule and guiding (orally) the stragglers. Exercises, quizzes and explanations are routine enough that good teachers I know can generate them on the spot.

>Exercises, quizzes and explanations are routine enough that good teachers I know can generate them on the spot.

Every year there are thousands of graduate teachers looking for tools to help them teach better.

>good teachers I know can generate them on the spot

Even the best teacher can't create an interactive multiple choice quiz with automatic marking, tailored to a specific class (or even a specific student) on the spot.

I've been teaching for 20+ years, I have a solid grasp of the pain points.

> Even the best teacher can't create an interactive multiple choice quiz with automatic marking, tailored to a specific class (or even a specific student) on the spot.

Neither can "AI" though, so what's the point here?

I'm creating tools on top of AI that can, which is my point.
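For what it's worth, the "automatic marking" half of that isn't magic; once the AI emits questions in a structured form, marking is trivial. A minimal sketch, with all names hypothetical (this is not my actual product's API):

```python
from dataclasses import dataclass

@dataclass
class Question:
    prompt: str
    choices: list[str]
    answer: int  # index into choices of the correct option

def mark_quiz(questions: list[Question], responses: list[int]) -> tuple[int, int]:
    """Mark a multiple-choice quiz automatically: count how many chosen
    indices match the correct answer. Returns (score, total)."""
    score = sum(1 for q, r in zip(questions, responses) if r == q.answer)
    return score, len(questions)
```

The hard part, and where the AI earns its keep, is generating the questions themselves, tailored to a class or a student.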

Can you post a question-and-answer example, if it doesn't violate an NDA? Because I have very little faith this is good for students.

Sure.

here's an example of a question and explanation which aligns to Australian Curriculum elaboration AC9M9A01_E4, explaining why 3^4/3^4 = 1 and 3^(4-4) = 3^0

https://chatgpt.com/share/89c26d4f-2d8f-4043-acd7-f1c2be48c2...

To further elaborate on why 3^0 = 1: https://chatgpt.com/share/9ca34c7f-49df-40ba-a9ef-cd21286392...

This is a relatively high-level explanation. With proper prompting (which, sorry, I don't have on hand right now) the explanation can be tailored to the target year level (Year 9 in this case), with exercises, additional examples and a quiz to test knowledge.
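For anyone who can't open the links, the index-law argument those explanations walk through can be sanity-checked in a few lines:

```python
# Two ways to evaluate the same quotient:
# 1. Anything nonzero divided by itself is 1: 3^4 / 3^4 = 1.
# 2. The quotient rule a^m / a^n = a^(m-n) gives 3^4 / 3^4 = 3^(4-4) = 3^0.
# For both routes to agree, 3^0 must equal 1.
lhs = 3**4 // 3**4    # 81 // 81 = 1
rhs = 3**(4 - 4)      # 3**0
assert lhs == rhs == 1
```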

This is just the first example I have on hand and is just barely scratching the surface of what can be done.

The tools I'm building are aligned to the Australian Curriculum, and as someone with a lot of classroom experience I can tell you that this kind of tailored content (explanations, exercises, etc.) is a literal godsend for teachers regardless of experience level.

Bear in mind that a teacher with a 4-year undergrad in their specialist area and a Masters in teaching can use these initial explanations as a launching pad for generating tailored content for their class, and even for individual students (either higher or lower level, depending on student needs). The reason I mention this is that there is a lot of hand-wringing about hallucinations. To which my response is:

- After spending a lot of effort vetting the correctness of responses for a K-12 context, I've found hallucinations are not an issue. The training corpus is so saturated with correct data that this isn't a problem in practice.

- In the unlikely scenario of a hallucination, the response is vetted by a trained teacher who can quickly edit and adjust it to suit their needs.
