> And it will make them up just like it does everything else. You can’t trust those either.

In tech-related matters such as coding, I've come to expect that every link ChatGPT provides as a reference or documentation is simply wrong or nonexistent. I can count on the fingers of one hand the times I clicked a link to a doc from ChatGPT that didn't result in a 404.

I've had better luck with links to products from Amazon or eBay (or my local equivalent e-shop). But for tech documentation which is freely available online? ChatGPT just makes shit up.
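For what it's worth, you don't have to click through every link by hand to find this out: a few lines of Python against the standard library will flag the 404s and dead domains for you. A minimal sketch (the example URLs are placeholders, not real ChatGPT output):

```python
import urllib.request
import urllib.error

def check_link(url: str, timeout: float = 10.0) -> str:
    """Return a short status string for a URL, e.g. '200 OK', '404', or an error."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "link-checker/0.1"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return f"{resp.status} OK"
    except urllib.error.HTTPError as e:
        return str(e.code)            # e.g. 404 for a made-up docs page
    except urllib.error.URLError as e:
        return f"error: {e.reason}"   # nonexistent domain, timeout, etc.

# Placeholder links as if pasted from a chat session -- swap in your own.
links = [
    "https://docs.python.org/3/library/urllib.request.html",
    "https://example.com/this/page/does/not/exist",
]

for url in links:
    print(check_link(url), url)
```

Note that some servers reject HEAD requests, so the occasional 405 is a false alarm, but it catches the plain 404s and invented domains well enough.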