Mostly, no. They will explain things to you and you'll feel like you understand them. When you have to do it, though, you'll find you're not any better off than when you started.

I used to see this with students in calculus who abused the tutoring resources. They'd have tutors just work problems (often their homework...) in front of them. "Ah! Obviously that trig substitution integral worked that way. Oh, of course, that proof is very obvious in retrospect." And then they'd walk away from the exam with a 30% and no idea why their 20 hours of "study" didn't produce the same results as their peers got by working problems themselves, reading the materials, asking questions, etc.

Most AI use is the same, in my experience. "Show me how the fundamental theorem of calculus works." The LLM puts together a very elaborate and flashy presentation that they skim. Great. That's no different than reading a textbook. Even if you ask the LLM questions and have it elaborate on things, you've never once done one of the most important things a student can do: spend time confused, working hard to understand something that's not obvious. The LLM will make it obvious at every point. Total lack of friction. Works about as well as a spotter who does the lifting for you.