It's starting to look more and more to me as if consciousness is just an illusion that we ourselves perceive. There is nothing fundamental about it; it's just an artefact of a certain style of computing, as perceived by the reasoner itself.
We look at current LLMs and, because we see how they fundamentally operate, we assume they can't be "conscious". But we really don't even know what consciousness is. The only people in the world who know ANYTHING about consciousness are anaesthesiologists: they know how to turn it off and on again. And what does that even tell you about consciousness?
We don't really have a good way to measure whether something has consciousness. Heck, we have pretty limited ways of testing how "intelligent" non-human animals are (e.g. https://en.wikipedia.org/wiki/Theory_of_mind_in_animals).
With that said, just because we don't have a great way of measuring it doesn't mean we should assume LLMs are intelligent. An LLM is code and a massive collection of training weights. It has no means of observing and reasoning about the world, and it doesn't store memories the way organic brains do (it's in fact quite limited in this respect). It currently can't solve a problem it hasn't encountered in its training data, or produce novel research on a topic without significant handholding. Furthermore, the frequent errors it makes suggest that it fundamentally does not understand the words it spits out.
Not really sure what you mean by your anesthesiology comment. Being able to intubate and inject propofol does not make you more of an expert on consciousness than neuroscientists and neurologists.
I didn't say we should assume LLMs are intelligent. In fact, I always thought they weren't, because they only do a single "forward pass".
But then they came up with the whole "reasoning model" paradigm, and that contains obvious feedback loops. So now I just throw my hands in the air, because I think no one really knows or can tell for sure. We are all clueless here.
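To make the "feedback loop" point concrete: even plain autoregressive generation feeds each output back in as input before the next forward pass, and reasoning models lean on this by generating intermediate "thinking" tokens that condition later ones. A toy sketch (the function names and the stand-in "model" are hypothetical, not any real LLM API):

```python
# Toy illustration of autoregressive feedback: each step's output is
# appended to the context that the next forward pass reads.

def forward_pass(tokens):
    # Stand-in for one model forward pass: deterministically "predicts"
    # the next token as the current context length. A real model would
    # run a neural network here.
    return len(tokens)

def generate(prompt_tokens, steps):
    context = list(prompt_tokens)
    for _ in range(steps):
        next_token = forward_pass(context)  # output of this pass...
        context.append(next_token)          # ...becomes input to the next
    return context

print(generate([1, 2, 3], 4))  # prints [1, 2, 3, 3, 4, 5, 6]
```

The loop itself is trivial, but it's the structural point being argued about: the model's own prior outputs shape everything it computes next.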
I can really recommend this book by Douglas Hofstadter: https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop
IMHO consciousness is just the ability to detect change. Everything can be calm and static, and then, suddenly, something changes. I think it is our capacity to notice that change that makes us conscious.
It's literally the only thing you can be certain of: your own consciousness.
You can only be certain you perceive it and you can't be certain others perceive it (or if others exist at all of course).
The only thing you can really tell is "I perceive myself in some sort of feedback-loop manner". Which to me even sounds like it has "arisen" from underlying mechanisms.
We can't even be sure about the feedback loop. LLMs show why: we have no way of telling whether our active memory is true, or whether the present moment is the only thing that has ever existed for us.