"copyright infringement factories"

Tells you right away where this is coming from.

Do you mean something specific? Because that sounds like a criticism with some blanks that need to be filled in.

If you just mean they come across as annoyed by AI, that's true, but that's also far too wide a category to infer basically anything else about them.

The criticism is valid. The problem is how you weigh this criticism.

I agree they are stealing it but I also see the benefit of it for society and for myself.

Suckerberg downloaded terabytes of books for training, while people around me got sued to hell 20 years ago for downloading one mp3 file.

they got sued for uploading actually

and Zuck isn’t sued for downloading either, he is sued for reproduction by the AI not being derivative enough, but so far all branches of government support that

Anna's Archive. Aaron Swartz.

FB and co are CIA fronts and they can do anything they please. Until they run up against Disney and the lobbying giants: if some CIA idiot tries to sue/bribe/blackmail them, they can order Hollywood to rot their image to pieces with all the wars they promoted in the Middle East and Latin America just to fill the wallets of CEOs. That, plus some social-critique movie about FB harvesting illegal user data all over the world to deny insurance and whatnot. And OFC with a clear mention of the Epstein case and the people related to it, just in case the Americans forgot about it.

Then the US military-industrial complex would collapse in months, with brainwashed kids running away from the army. Not to mention the Call of Duty franchise and the like. It would be the end of Boeing and several more, of course. To hell with profit-driven wars for nothing.

Ah, yes, AIPAC lobbies and the like. Good luck taming right-wing wackos who hate the MAGA cult more than the 'woke' people do. They'd be the first ones against you after sinking the US's image for decades, even more than the illegal Iraq war with no WMDs and the Bush/Cheney mafia did.

The outcome of all this? Proper, serious engineering a la Airbus, with the profit-driven MBA and war sickos instantly kicked out of the spot. And OFC the AI snake-oil sellers gone too, except for classical AI/NN applied to concrete cases (image detection and the like); those will survive fine, even better, because those jobs are highly specific and the models aren't statistical text parrots. They can give guaranteed results, unlike LLMs, which tend to degrade because the feed of human-made content has to be continuous, while for tumour detection a big enough sample can cover 99% of the cases.

R&D on electric vehicles/energy and nuclear power like nowhere else. And, for sure, the EV equivalent of the Ford Model T for Americans: a cheap, reliable one, good enough for the common Joe/Mary without being a luxury item. A new Golden Age would rise, for sure. But the oil mafia will fight it like crazy.

I don't know how anyone can call the most amazing invention in computer science of the last 20 years "copyright infringement factories". We went from the ST:TNG ship computer being futuristic tech to "we kinda have this now". It's like calling cars "air pollution factories", as if that were their only purpose and use.

A fundamentally anti-civilisational mindset.

You can see both sides, criticise how it's done, and still want the result of it.

It's a little bit hypocritical, which often enough ends in realism, aka "okay, we clearly can't fight their copyright infringement because they're too powerful and too rich, but at least we can use the good side of it".

Nothing, btw, forces all of this to happen THAT fast besides capitalism. We could slow down; we could do it better, or more right.

I'm sorry, but you're acting obtuse if you pretend you don't know why they're being called that.

LLMs are amazing technology. It's crazy to interact with something that knows a lot about effectively everything that's ever been written, as well as mimicking human cognition to a large degree.

What LLMs are NOT is intelligent in the same way as a human, which is to say they are not "AGI". They may be loosely AGI-equivalent for certain tasks, software development being the poster child. LLMs have no equivalent of "judgement", and they lie ("hallucinate") with impunity if they don't know the answer. Even with coding, they'll often do the wrong thing, such as writing tests that don't test anything.

It seems likely that LLMs will be one component of a truly conscious AI (AGI+), in the same way our subconscious facility to form sentences is part of our intelligence. We'll see how quickly the other pieces arrive, if ever.

The people pushing this technology, which accelerates climate change, have lobbied the government to circumvent the usual roadblocks society creates to limit sensationalist development. Incidentally, they're the same people who talk about how dangerous AI will be for society; but don't worry, they're going to be the ones to deliver it safely.

Now, I don't believe AI will ever amount to enough to be a critical threat to human life, you know, beyond the immense amounts of wasted energy they propose to convert into something more useful, like a market crash or heat and noise, or both.

Not sure how you can call someone opposed to any of that "anti-civilisational" matter-of-factly.