For me, GPS definitely reduced my ability to find my way, and I suspect the same will be true of AI. I think it could lead to reduced critical thinking: frequent use of ChatGPT for quick answers may reduce opportunities to practice critical thinking and problem-solving skills. Passive learning might also be an issue. Relying on AI for information without actively engaging with the material could lead to superficial understanding. Easy access to information might reduce the need to memorize facts, potentially affecting long-term retention. The same goes for writing and research: using AI to generate text or conduct research might diminish one's ability to perform these tasks independently. On the other hand, the same is true of almost all technology, so we might gain headspace for other problems.

Interesting comment about GPS. I find I learn a lot because I always check the route it chooses, and often find possibilities I didn't consider. I think GPS is different from AI because it calculates from a well-defined set of parameters. AI, on the other hand, regurgitates other people's work without having any actual knowledge at all.

Pointing at tech and blaming it for degrading the human condition is refusing to take responsibility for your own predicament. (Not saying anyone here is doing it - it's a generalization!)

People who get 'dumber' from technologies (like the GPS a commenter mentioned - nice example - but also auto-correct, intellisense, and similar helper tools) choose for themselves the path of least resistance, and of least learning. Arguably, they are already dumb, as they choose not to learn.

It's all OK, I would say; there's no rule that forces people to learn or be smart. But if you do want to learn, avoid _constantly_ using technologies which hamper your learning.

In the end, be responsible for yourself, and don't blame external things for preventing you from doing things when you yourself choose to engage with them. (If it's not by choice, it'd perhaps be a different case, but no one is forcing anyone to use AI....)

I’d take a slightly different perspective. I’d like to think that these tools are in some ways “humanizing” - we can offload things that we’re not particularly good at (like memorization tasks) and instead use our capacities for things that (at least for now) we humans are uniquely capable of doing. As an example: back in the ’90s people knew many phone numbers by heart; nowadays I don’t think people know more than a handful, if that. Does that mean that “phone contacts” are making us dumber? Or perhaps we can put the time/effort/capacity to better use.

I do agree that we might want to offload mundane and boring or repetitive tasks which do not add value to our lives. But this 'value' is a personal and subjective thing, so it's hard to give a recipe that will fit everyone.

Hence I think it's important for people to be educated in how to maintain this balance of easy life vs. hard life, so they can optimise what they get from their own lives.

I, for example, use neither intellisense nor auto-correct. (I do make a lot of spelling mistakes!) I want to learn to program, so intellisense would break that learning. I do not want to 'produce programs' - for which intellisense is _super good_, as it increases productivity a lot. I know most people make the tradeoff the other way, as they prefer productivity over learning.

> to better use

I'm still waiting for this to take effect.

> Pointing at tech and blaming it for degrading the human condition is refusing to take responsibility for your own predicament. (Not saying anyone here is doing it - it's a generalization!)

So I take issue with this, but I also don't necessarily argue that I'm right. This is similar in my mind to free speech on podiums, and to where we sit in general on responsibility for collective health, etc.

I personally don't think we're much beyond monkeys. Some of us are fortunate and aren't tempted by addictive behaviors, substances, etc. However, I suspect most of us are.

Of course there's an argument that no amount of addictive substance (drug or otherwise) absolves your personal responsibility not to consume it, but with enough resources and legal right-of-way, how can we expect the average person - aka humanity on average - to resist well-funded corporations pushing so many angles of it? Would we not be ensuring our own damnation?

To be clear, the reality is far less addictive and destructive than what I described above; I'm not trying to be hyperbolic about the situation. Nonetheless, I do feel like most consumer relationships over the last 40 years have been more adversarial than not, and companies have only grown more talented and better funded in this test of wills. It leaves me unsure how to proceed, for humanity.

Hey, I both agree and disagree, perhaps for the same reasons. My words are a bit utopian, I suppose, and I do realize things aren't black and white. People need to have their wits about them to know what impact stuff has on them as a person, and this is really difficult to do. In my mind that's a problem with our world, but I guess that's a personal issue I have with it. It's not a requirement for people to be more conscious, though I'd wish they (and I!!) were.

I think people should be free to talk, even to say nasty things, and others should have thick skin. But I don't insult people, because I don't like to hurt people, even though I don't think they should be hurt by my opinion even if I give it. So you're right, it's a bit in the middle: on one side it's a problem that things impact people collectively, and on the other side people should not let things impact them like that.

They are free to choose, but the choices might not always be clear or known.

The complexity in this area is a reason why I'm a proponent of public education, more societal focus on it, etc.

I suspect a big part of personal responsibility rests on the assumption that the average person is intelligent and capable of making informed decisions - something I suspect many aren't, these days. Which isn't to say that we're all uneducated morons, but rather that many of the areas I brought up are purposefully being kept dark, obscured, etc. Which makes informed decisions... well, difficult.

Anyway, sorry for the tangent, just wanted to bring it up. Appreciate your time :)

"I hope we will make AI push us to think more, not less. And through more thinking, we will create new, original approaches or solutions to old but very important problems."

I want to be positive, but I am not as optimistic as the dear writer.

I fear that our ability for long-term critical thinking will suffer. We already see that happening in the age of dopamine-fueled, short-form content: the average attention span is becoming shorter, but our ability to reason effectively is not becoming faster.

Adding AI into the mix, the next 50 years will be very interesting for the Gen Z world.

Education at degree level is a self-learning process.

You read, you discover, you clarify, you expand, you think for yourself, you find your own way, you discover your own views, uncover your own theory on the topic you study. You own what you learn.

An integration of the accumulated knowledge.

Like the four stages of competence:

Unconscious incompetence

Conscious incompetence

Conscious competence

Unconscious competence

AI cannot replace this process.

I feel dumb already, nowhere to fall

Yes, in the same way that Socrates said the same thing about writing.

[deleted]

Using AI as a coach or mentor shapes your thinking process significantly, because the AI's internal rules and censorship (which we don't even know) shape its answers, your outputs and, consequently,... your future O_o

Due to the high cost of therapy, I expect several people who use therapists for ideas about their situation would do well sending a long-form prompt to ChatGPT and asking for therapeutic ideas. Certainly cheaper than $100-250 per session with a therapist who isn't making progress.

And how is this different from a human teacher?