And you don’t believe that there’s ever going to be a time in any future ever, when a group of machines is going to autonomously challenge or coerce an individual human or group of humans?
It's a machine. It by definition lacks autonomy.
The act may be circuitously arrived at, but still: somebody has to write and run the program.
That kind of dodges my question.
I’ll repeat it: is there any time in the future when you believe a machine or set of machines could measurably outperform a human to the degree that they can coerce or overpower them with no human intervention?
(Ya sure, because repeating yourself is always so helpful)
Well, leaving aside the "with no human intervention" part, which is a bit fuzzy...
Ya sure. AI can already contrive erudite bs arguments at a moment's notice, sell stuff pretty good and shoot guns with great accuracy.
Do you?
Yes I do
So, given that we agree there will be superhuman robotic systems, would you disagree that such a system, at scale, would be impossible for a human or group of humans to overcome?
Ya don't say.
Just state your big hypothesis already.