So this doesn't happen in the paid plans of ChatGPT? But why?

Paid plans give you access to much larger, more capable models with thinking enabled (extra inference-time compute spent reasoning before answering). In the example here you can see GPT Pro taking 20–80 minutes to respond with the proof.

All of this is far more expensive to serve, so it’s locked behind paid plans.

> thinking enabled (inference time compute)

What do you mean by compute?

I would google it or ask ChatGPT to learn more about this; the free version should be totally sufficient.