ChatGPT hallucinates things all the time. I will feed it info on something and have a conversation. At first it's mostly fine, but eventually it starts just making stuff up.

I've found that giving it occasional nudges (like reminding it of the original premise) can help keep it on track.

Ah yes, it's a fantastic tool when you have to manually correct it all the time.
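For what it's worth, if you're talking to the model through an API rather than the chat UI, the "occasional nudge" idea can be automated by re-injecting the original premise into the message history every few turns. A minimal sketch below; the message format mirrors common chat APIs, but the reminder interval and wording are arbitrary choices, not anything the model vendors recommend:

```python
# Sketch of the "occasional nudge" idea: re-insert the original premise
# as a system reminder every few user turns so it stays in the model's
# recent context. NUDGE_EVERY and the reminder wording are arbitrary.

NUDGE_EVERY = 4  # re-send the premise every 4 user turns (assumption)

def build_messages(premise, turns):
    """Assemble a chat-API-style message list from (user, assistant)
    turn pairs, repeating the premise every NUDGE_EVERY user turns."""
    messages = [{"role": "system", "content": premise}]
    for i, (user_msg, assistant_msg) in enumerate(turns, start=1):
        if i % NUDGE_EVERY == 0:
            messages.append({"role": "system",
                             "content": f"Reminder of the premise: {premise}"})
        messages.append({"role": "user", "content": user_msg})
        if assistant_msg is not None:
            messages.append({"role": "assistant", "content": assistant_msg})
    return messages

turns = [(f"question {i}", f"answer {i}") for i in range(1, 6)]
msgs = build_messages("We are discussing the 2019 annual report only.", turns)
reminders = [m for m in msgs if m["role"] == "system"]
print(len(reminders))  # initial premise + 1 reminder at turn 4 -> 2
```

You'd pass the resulting list as the `messages` payload on each request; whether this actually reduces drift will depend on the model and conversation length.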