Like any human would: 75% certain, stated with 99% confidence. That's what you fail to realize. They aren't "god-mode machines". They are "human-mode" machines, and humans make mistakes in their thinking just like you do. Some might say asking a powerful LLM for gaming tips is a waste of compute. Others might say it gives you early knowledge of an emerging meta. Either way, you're both going to get trained.