I truly believe AI will change all of education for the better, but of course it can also hinder learning if used improperly. Those who genuinely want to learn will learn, while those looking for shortcuts will do more harm to themselves. I just did a Show HN today about something semi-related.
I made a deep research assistant for families: children can ask it to explain difficult concepts, and parents can ask how to handle any parenting situation. For example, a 4-year-old might ask "why does the plate break when it falls?"
example output: https://www.studyturtle.com/ask/PJ24GoWQ-pizza-sibling-fight...
I think research and the ability to summarize are important skills, and automating them away will have bad downstream effects. I see people on Twitter asking Grok to summarize a single paragraph, so I don't think further cementing the idea that a tool will summarize for you is a good one.
I'm conflicted about this. Custom tutoring available at all times and at scale sounds like a great thing, if done right. On the other hand, the research skill you mentioned is something I worry about atrophying as well. Before, we would read through three or four loosely related articles or Stack Overflow questions and do the transfer of related topics onto our specific problem ourselves; with a tutor, it's all pre-chewed.
Then again, human 1:1 tutoring is the most effective way to learn, isn't it? In the end it'll probably come down to a balance: reading through texts yourself and researching broadly so you get a sense of the context around whatever you're trying to do, while having a tutor available to walk you through it when you don't get it.
Do you genuinely have any non-anecdotal reason to believe that AI will improve education, or is it just hope?
I ask because every serious study I've seen on using modern generative AI tools tends to find fairly immediate and measurable deleterious effects on cognitive ability.
> I ask because every serious study on using modern generative AI tools
There are a lot of studies, and I can't say I've read all of them, but in the ones I have read there hasn't been much focus on how the participants used the LLM to learn. My guess is that this has a big effect on the end results. Someone who just asks for the answer and then thinks "let's remember this" will get very different results than someone who uses the Socratic method together with an LLM, to give just one example.
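To make the distinction concrete, here's a minimal sketch of those two usage patterns with a chat-style API. The message format follows the common `{"role": ..., "content": ...}` convention; the prompt wording and the `build_messages` helper are my own illustration, not from any study, and no real API is called here.

```python
# Sketch: "just give me the answer" vs. a Socratic system prompt
# that nudges the model to ask guiding questions instead.

SOCRATIC_PROMPT = (
    "You are a tutor. Never state the answer outright. "
    "Ask one short guiding question at a time, building on what "
    "the student has said, until they articulate the answer themselves."
)

def build_messages(question, mode):
    """Assemble the chat message list for either learning style."""
    if mode == "answer":
        # Pattern 1: ask for the answer directly and try to memorize it.
        return [{"role": "user", "content": question}]
    if mode == "socratic":
        # Pattern 2: a system prompt forces guided questioning.
        return [
            {"role": "system", "content": SOCRATIC_PROMPT},
            {"role": "user", "content": question},
        ]
    raise ValueError(f"unknown mode: {mode}")

msgs = build_messages("Why does a plate break when it falls?", "socratic")
```

Same question, same model; the only difference is the framing the learner sets up front, which is exactly the variable I'd want the studies to control for.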
You know, that's a good point too. The studies I've read all focused on cognition after using an LLM to complete tasks for work or hobbies. I do wonder if there might be a different outcome with learning specifically.
Every technology can be good or bad for an individual depending on how they use it; it's up to the user to decide how they will use the tool. For people who really want to learn a topic and understand it in detail, I think it can genuinely help them grasp the concepts.