
"Eschew flamebait. Avoid generic tangents."

https://news.ycombinator.com/newsguidelines.html

Okay. But then you could say the same for a human: isn't your brain just a cloud of matter and electricity that reacts to senses deterministically?

> isn't your brain just a cloud of matter and electricity that just reacts to senses deterministically?

LLMs are not deterministic.

I'd argue that over the short term humans are more deterministic. If I ask a human the same question multiple times, I get the same answer. If I ask an LLM, each answer could be very different depending on its "temperature".
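For what "temperature" actually does at the sampling step, here's a minimal sketch (the logits and function name are mine, for illustration): higher temperature flattens the token distribution, lower temperature concentrates it on the top choice.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Pick an index from `logits` after temperature-scaled softmax.

    High temperature -> flatter distribution -> more varied picks.
    Low temperature  -> sharper distribution -> nearly always the argmax.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling over the softmax probabilities.
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

Note that the randomness lives entirely in `rng`: with a fixed seed, even "nondeterministic" sampling replays the exact same sequence of picks.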

If you ask a human the same question repeatedly, you'll get different answers. I think by the third time you'll get "I already answered that", etc.

We hardly react to things deterministically.

But I agree with the sentiment. It seems it is more important than ever to agree on what it means to understand something.


I'm having a bad day today. I'm 100% certain that today I'll react completely differently to any tiny issue compared to how I did yesterday.

Right, if you change the input to your function, you get a different output. By that logic, the function `(define (add a b) (+ a b))` isn't deterministic.
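To make that distinction concrete (a trivial sketch, names are mine): determinism means the *same* input always yields the same output; a different output for a *different* input says nothing either way.

```python
def add(a, b):
    # A pure function: no hidden state, no randomness.
    return a + b

# Same input, same output -- that's what deterministic means.
assert add(2, 3) == add(2, 3)

# Different input, different output -- still perfectly deterministic.
assert add(2, 3) != add(2, 4)
```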

I mean, try clicking the Copilot button and see what it can actually do. Last I checked, it told me it couldn't change any of the actual data itself, but it could give you suggestions. Low bar for excellence here.

OK then. Groks?