I found it very interesting that Altman et al. were worried that AI will become supremely intelligent and that China will make a supervirus or some AI drones or whatnot, but not a single person was worried about destroying all jobs because we wouldn't need humans any more.

Or maybe they were not so much "worried" but "hopeful" that they'd amass literally all the wealth in the world.

Altman is an advocate of Universal Basic Income, as far as I'm aware. That suggests he is, in fact, worried about massive job losses.

https://www.cbsnews.com/news/sam-altman-universal-basic-inco...

https://finance.yahoo.com/news/sam-altman-wants-universal-ex...

> Altman is an advocate of Universal Basic Income

So he says. And the way he proposed reaching that was with a scam cryptocurrency under his control which has rightfully been banned in several countries.

https://www.buzzfeednews.com/article/richardnieva/worldcoin-...

https://www.technologyreview.com/2022/04/06/1048981/worldcoi...


If there's one thing that's clear from the article, it's that he's a proponent of anything that will benefit him, even multiple conflicting things at the same time.

I also find that interesting.

And not intending to defend the motives of anyone involved, but I'm hoping we don't actually need to worry about literally all jobs being destroyed and AI companies amassing all the wealth in the world.

Don't we need at least some humans working and earning to buy these AI services? Am I not being imaginative enough? Is it possible for the whole economy to consist just of AI selling services to each other?

I realise that even if AI destroys most jobs, or even just a lot of jobs, and amasses most wealth, or a lot of wealth, it would still be a terrible thing for humans. The word "all" could have just been hyperbole, and it is still a valid point. I just want to know people's thoughts on whether entire replacement is possible.

Why keep human consumers to buy your services when you could just amass all the wealth you desire and have autonomous systems that ensure your unassailable physical security? You would sit atop the most stratified dominance hierarchy ever achieved, reducing other humans to mere pets or breeding stock. I don't think normal humans would desire that kind of power, and I don't believe LLMs will take us there, but I wouldn't put it past a perverted billionaire maniac.

Do you need ants buying services from humans for the world economy to function?

If AI will indeed become superintelligent, we won't matter.

I think fundamentally, the concern is misplaced. The fact that you need to work for wealth is a convention of our constraints. A change in constraints would lead to other means of distribution. It's easy to see why someone who believes more productivity is good would not see making jobs obsolete as a real problem. They would see us adapting to the new conditions in relatively short order.

> The fact you need to work for wealth is a convention of our constraints

The current constraint is "you need to produce to have things".

If one company's AI takes all the jobs, and thus does all the producing-to-have-things, the constraint transforms into "you need that company's permission to have things".

Hence the top-level question.

The new conditions will almost surely be like the old conditions: slavery, sexual exploitation, etc.

Those who are concerned are implying that any new distribution mechanism is not going to favour them.

And under the capitalist system, if nothing changes, the "new" distribution system is indeed not going to favour them: at best there would be some sort of UBI, and at worst you would be left to starve in the streets.

However, I cannot see how one can transition to a new system and yet have the existing powers in the current system agree to it without being disadvantaged.