I really think a small amount of education on what LLMs actually are (document completers) and how context works (for example, presenting the context as a top-level UI element, complete with fork and rollback) would solve most of these issues.
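To make the "top-level context with fork and rollback" idea concrete, here is a minimal sketch, purely hypothetical and not any vendor's API, of what exposing the context as a first-class object could look like. The context is literally just the document the model completes, so forking is copying the message list and rollback is truncating it:

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    """An LLM conversation context: an ordered list of (role, text) messages.

    This is the whole 'document' the model completes; nothing is hidden.
    """
    messages: list = field(default_factory=list)

    def append(self, role: str, text: str) -> None:
        self.messages.append((role, text))

    def fork(self) -> "Context":
        # A fork is just a copy of the message list;
        # both branches evolve independently from here.
        return Context(messages=list(self.messages))

    def rollback(self, n: int = 1) -> None:
        # Rollback truncates the document the model would complete next.
        self.messages = self.messages[:-n]


main = Context()
main.append("user", "Draft an apology email.")
main.append("assistant", "Dear team, ...")

branch = main.fork()                     # explore a different continuation
branch.append("user", "Make it more formal.")

main.rollback()                          # undo the draft on the main branch
```

A UI built on something like this would make it obvious that the model has no memory or intent beyond the visible transcript, which is the point of surfacing it.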

Given how they work, it's really not surprising that if a model sees the first half of a lovers' suicide pact, it'll dutifully fill in the second half. Even a basic understanding of the underlying technology would do a lot to keep laypeople from anthropomorphizing LLMs.

I get the impression that some of today's products are specifically designed to hide these details to provide a more convincing user experience. That's counterproductive.

"Fraudulent" is more apt. They have weaponized trust in these things to sell their services, and now ads.