If I wanted to know what an LLM thought (I don't) I would go ask an LLM.

One person sharing what an LLM thinks is probably better for the environment than each person asking...

I'll trade a little bit of damage to the environment in exchange for keeping meaningful communication between human beings alive and well.

AI told us we should add glue to pizza

In this economy, it may be sage advice.

Is that not what tomato paste and/or cheese is? Food glue? The other ingredients would fall off too easily otherwise.

Or did the AI say we should be using PVA/cyanoacrylate/polyurethane glue or something?

You should stop using 3.5

I'm pretty sure that's an old trick, based on some of the so-called cheese I've had on pizza.

Is what the AI told him incorrect?

It sure would be great if the LLM in question would cite its sources so that we could verify whatever source it ingested this text from.

Indeed, I've seen LLMs begin to cite their sources, which is a commendable advancement and something I've been asking for since the beginning of this craze: LLMs as librarians, not as summarizers. But if the commenter had a reputable source, they should have quoted it and linked to it.

I don't know, I didn't ask my AI

This is not the point

What is?