> Some of it I can even regurgitate almost exactly

If you (or any human) violate copyright law, legal redress can be sought. The amount of damage you can do is limited because there's only one of you, whereas AI instances can be duplicated at marginal cost.

There are many other differences between humans and AI, both in capabilities and in the motivations of the legal persons making the decisions.

You may be right about the damage (I won't dispute it, even though I personally doubt it), but what about the amount of good it can do too? When deciding "what is to be done now" under uncertainty, we typically look at both sides of the ledger: the upsides in addition to the downsides.

Assume for a moment that the current AI is teaching us that compute transforming data → information → knowledge → intelligence → agency → ... → AGI → ASI is all there is to Intelligence-on-Tap, and imagine an AI path opens to AGI now and ASI later, where previously we didn't see any. It seems a bad deal to me to frustrate, slow down, or even forego the 2050s Intelligence Revolution that may multiply total human wealth by a factor of 10 to 20 in value, the way the Industrial Revolution did in the 1800s. And we are to forego this for what? So that we can provide UBI to Disney shareholders? Every one of us is richer and better off now than any king of old. Not too long ago, even the most powerful person in the land could not prevent 17 miscarriages, stillbirths, and child deaths from leaving them without an heir to ascend the throne (a top priority for a king and queen, for sure). So in our imagined utopia, even the Disney shareholders are better off than they would be otherwise.

> It seems a bad deal to me to frustrate, slow down, or even forego the 2050s Intelligence Revolution that may multiply total human wealth by a factor of 10 to 20 in value...

Why do you assume the emergence of a super intelligence would result in human wealth increasing instead of decreasing? Looking at how humans with superior technology used it to exploit fellow humans throughout history should give you pause. Humans don't care about the aggregate "dog wealth" - let alone that of ants.

I'm assuming the Intelligence Revolution, multiplying human intelligence with machines, will have the same effect the Industrial Revolution had by multiplying human physical strength. That multiplied GDP by a factor of ~20, hockey-stick-like, in a fairly short time, a century or two.

The Industrial Revolution was powered by natural resources that it helped unlock. What value reserve will AI tap into to create hockey-stick growth?

It will recombine the existing resources in new ways. Neanderthals had access to exactly the same natural resources as we have now, and obviously we do much more with what we've both got than they ever did. It's not only the availability of some atoms or molecules, but what one does with them, how one recombines them in novel ways. For that one needs knowledge and energy, and the latter, it turns out, can mostly be derived from the former too.

Obviously it's what we do with them; the biotech manufacturing and nuclear power production revolutions happened pre-AI. The reason they haven't replaced petroleum is economic and social.

> The amount of damage you can do is limited because there's only one of you, whereas AI instances can be duplicated at marginal cost

But enough about whether it should be legal to own a Xerox machine. It's what you do with the machine that matters.

> It's what you do with the machine that matters.

The capabilities of a machine matter a lot under law. See current US gun legislation[1] or laws banning the export of dual-use technology for examples of laws that treat inherent capabilities, not just the use of the thing, as core considerations.

1. It's illegal to possess an automatic weapon manufactured after 1986; older ones are grandfathered.

While true, computers in general already had the ability to perfectly replicate data, hence the blank media tax: https://en.wikipedia.org/wiki/Private_copying_levy

I think the reason for all the current confusion is that we previously had two very distinct groups, "mind" and "mindless"*, and that left everyone a lot of freedom to learn a completely different separating hyperplane between the categories. AI is now far enough into the middle that for some of us it's on one side and for others of us it's on the other; see the toy sketch after the footnote.

* and various other pairs that used to be synonyms but no longer are; so also "person" vs. "thing", though currently only very few actually think of AI as person-like
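
A minimal sketch of that hyperplane picture, with made-up clusters and hand-picked boundaries (nothing here comes from the thread): both boundaries separate the clearly-"mindless" and clearly-"mind" examples perfectly, yet they disagree about a point sitting in the gap between them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated 2-D clusters: "mindless" near (0, 0), "mind" near (10, 10).
mindless = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
mind = rng.normal(loc=[10.0, 10.0], scale=0.5, size=(50, 2))

def classify(points, w, b):
    """Label points by the sign of the linear score w.x + b (1 = "mind")."""
    return (points @ w + b > 0).astype(int)

# Two hand-picked hyperplanes; both fit the historical examples perfectly.
boundary_a = (np.array([1.0, 1.0]), -4.0)    # x + y = 4
boundary_b = (np.array([1.0, 1.0]), -16.0)   # x + y = 16

for w, b in (boundary_a, boundary_b):
    assert classify(mindless, w, b).sum() == 0       # all "mindless" on one side
    assert classify(mind, w, b).sum() == len(mind)   # all "mind" on the other

# A new point in the middle (the "AI" case): equally valid boundaries disagree.
ai_point = np.array([[5.0, 5.0]])
print(classify(ai_point, *boundary_a))  # [1] -> "mind"
print(classify(ai_point, *boundary_b))  # [0] -> "mindless"
```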

Yes, but gun control and dual-use export regulations are both stupid. We need fewer tool-blaming laws, not more.

(Standing by for the inevitable even-goofier analogy comparing AI with privately-owned nuclear arsenals...)