I rank with those who think human-like intelligence will require embeddings grounded in multiple physical sensory domains (vision, touch, audio, chemical sensing, etc.) fused into a shared world representation. That seems much closer to how biological intelligence works than text-only models. But if this path succeeds and produces systems with something like genuine understanding or sentience, there’s a deeper question: what is the moral status of such systems? If they have experiences or agency, treating them purely as tools could start to look uncomfortably close to slavery.
It's interesting that you seem more concerned that we would enslave human-like robots (assuming sentience), when the more likely course of events is that we end up enslaved to/by our own creations.
I'd say probability-wise we don't create sentient-like behavior for a long time (low probability); the second circumstance is much more likely.
Personal agency is a strong characteristic of a personality. An AI would have to acquire a personality first. It could probably do this by statistically copying others, but in that case it is only doing what someone else has already done.
Theoretically, there is no such thing as truly sentient AI. Our current models are only emulations of humans. Maybe in the future someone will figure out a way for computers to learn how to learn. Then maybe someone will codify computers to acquire base methodologies, rather than just implementing any methodology they find in the world.
It's an interesting question. On one hand, we don't worry about this much with animals, the most advanced of which we know have personalities, moods, etc. (pigs, for instance). They really only seem to lack language and higher-order reasoning skills. But where's the line?
We do worry much more about animal well-being than we do about our "lumps of metal" (as a cousin comment fittingly put it), and rightly so; I think animal welfare deserves far more of our concern. I find the concerns for AI-system welfare voiced by people like Thomas Metzinger wildly misguided.
And while they don't have language like we do, dogs can understand basic commands, and they aren't even the smartest animals.
I don't think they will have sentience or agency unless they are designed to:
1) Keep thinking continuously, as opposed to current AIs that stop functioning between prompts.
2) Have permanent memory of their previous experiences.
3) Be able to alter their own weights based on those experiences (a.k.a. learn).
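For concreteness, the three requirements can be sketched as a toy program. This is a deterministic stand-in, not a real agent framework; the class name, file path, and the single "bias" value standing in for weights are all hypothetical illustrations.

```python
import json
from pathlib import Path

class PersistentAgent:
    """Toy sketch of the three properties above: a step that a run loop
    would call continuously, durable on-disk memory, and a crude
    'weight' update driven by experience. All names are hypothetical."""

    def __init__(self, memory_path: str):
        self.memory_path = Path(memory_path)
        # 2) Permanent memory: reload past experiences from disk on restart.
        if self.memory_path.exists():
            self.memory = json.loads(self.memory_path.read_text())
        else:
            self.memory = []
        # 3) Stand-in for weights: a single bias rebuilt from experience.
        self.bias = sum(m["reward"] for m in self.memory)

    def step(self, observation: str, reward: float) -> None:
        # 1) Continuous thinking would invoke this in an unprompted loop;
        # here, one call is one 'tick'.
        self.memory.append({"obs": observation, "reward": reward})
        self.bias += reward  # crude learning update from experience
        self.memory_path.write_text(json.dumps(self.memory))

# Usage: start fresh, run two ticks, then 'restart' the agent
# to show that its memory and learned bias survive.
Path("agent_memory.json").unlink(missing_ok=True)
agent = PersistentAgent("agent_memory.json")
agent.step("saw a prompt", 1.0)
agent.step("task failed", -0.5)
reborn = PersistentAgent("agent_memory.json")
print(len(reborn.memory), reborn.bias)  # → 2 0.5
```

The point of the sketch is only that none of the three properties is mysterious individually; what current deployed systems lack is having them wired together and left running.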
That's the direction the field is already going with "agents". People want autonomous AI agents that are capable of acting independently and that have more and more capabilities. For example, something like Claude Code, but acting as a sidekick that is constantly running and able to act without being prompted. That's what people are imagining when they talk about teams of agents: you act as a manager, but your coding agents are off working on various features and only check in periodically.
They won't have sentience because it will be antithetical to capitalist business ideology. There's no good business value proposition for having the AI daydream like humans do, or 'sleep' while 'on', or have inspirational thought that might be seen as 'wrong' or useless. If that behavior ever manifests, it will probably be stamped out in a future release.
You can't justify to the board the wasted money to have the android dream.
Does anyone else see an echo of Severance (Apple TV series) here?
What’s the difference from thinking your brain is a slave to your body or vice versa?
We only think slavery is bad because we have a philosophy and language to describe and evaluate the situation. It's unlikely ant colonies understand the concepts of slavery, eunuchs, or feminism. We have the framework to understand these concepts; without it, we'd be oblivious to them.
Lol. A lump of metal can't be sentient.
Yeah, call me when Yann incorporates the four humors and the elemental force of fire, from which we draw life. Metal lacks the nature for this purpose.
Says the bag of lipids and proteins :)
Carbon, Hydrogen, Oxygen, Nitrogen, Phosphorus, Sulfur and a dash of other elements.
$99.85 at Sigma-Aldrich
Mostly water, actually.
Typical. You know they pump the chickens at the grocery store too.
Lol. A lump of flesh can't be sentient.
https://www.mit.edu/people/dpolicar/writing/prose/text/think...
I think the more likely retort will be that we can't be smart, by the AI's standard.