I just had a conversation with free ChatGPT about when a sports game started. ChatGPT got it hilariously wrong, giving a time it couldn't possibly be based on the other things I know about the game. I just didn't want to trawl through search results to find out, so I thought AI could be a nice shortcut. Mistake, I guess. Then I tried to tell it how and why it was wrong, and got further hilariously wrong responses from the AI. I couldn't help giving a few more pointless clarifying replies, even though I knew I would get nothing out of it and the AI would learn nothing. I seem to do this every month or so, get frustrated with how useless it is, and then swear off it for another month.
Did you ask it to search the Internet as part of your request? It's still extremely imperfect, but that typically helps it get basic details correct. At least for me, at any rate.