>Why does AGI necessitate having feelings or consciousness
No one knows if it does or not. We don't know why we are conscious, and we have no test whatsoever to measure consciousness.
In fact, the only reason we know that current AI has no consciousness is that "obviously it's not conscious."
Excel and PowerPoint are not conscious, so there is no reason to expect any other computation inside a digital computer to be different.
You could say something similar about matter and human minds, but we have only a limited and incomplete understanding of the brain, and possibly even of the universe. Furthermore, we do have a subjective experience of consciousness.
On the other hand, we have a complete understanding of how LLM inference ultimately maps to matrix multiplications, which map to discrete instructions, and how those execute on hardware.
I know I have a subjective experience of consciousness.
I’m less sure about you. Simply claiming you do isn’t hard evidence of the fact; after all, LLMs do the same.
If there were evidence that one LLM was conscious, I would accept it for others as well. However, this is not the case.
But we know at least one human is conscious. That convinces me.