Organizations are choosing to eliminate workers rather than amplify them with AI because they'd rather own 100% of diminished capacity than share proceeds from exponentially increased capacity. That's the rent-extraction model consuming its own productive infrastructure. The Stanford study documents organizations systematically choosing inferior economic strategies because their rent-extraction frameworks cannot conceptualize workers as productive assets to amplify. This reveals that these organizations are economic rent-seekers that happen to have productive workers, not production companies that happen to extract rents. When forced to choose between preserving rent-extraction structures and maximizing value creation, they preserve extraction even at the cost of destroying productive capacity. So what comes next?
> So what comes next?
When you don’t need as many people because of automation, you also don’t need them to fight your wars; you use drones and other automated weapons. You don’t need things like democracy, because that existed to prevent people from turning to revolution, and automated weapons have solved that problem. And once you don’t really need as many people, you stop providing the expensive healthcare, food production, and water that keep them all alive.
Yeah, this is what we are seeing today. It's also not just junior jobs going: according to Amazon, they are using AI to get rid of expensive senior employees while actually holding onto juniors who use AI tools.
We have seen a lot of H-1B use and outsourcing despite the massive job shortage. I'm also seeing lots of fake job sites filled with AI-generated fake openings and paid memberships for access to "premium jobs."
They're using ICE to effectively pay half the country to murder the other half, but the ICE budget is limited, so automated systems can then gun down the ICE crowd itself, replacing 99.9% of humans with machines.
Ultimately this is great for Russia: they'll still be able to invade even if they have only 300 soldiers left in their military. After they set off a low-orbit nuke blast to shut down the AI-run US, basically Melania swinging her purse at the troops will be one of the few left alive to resist.
> Ultimately this is great for Russia
Wat
Dark. But I can't think of a way to rebuke it…
There's likely a slippery-slope fallacy in there somewhere (I hope). If you're interested in the (not so) sci-fi aspects of automated weapons and their ramifications, I often plug Daniel Suarez's great Kill Decision talk and book: https://www.youtube.com/watch?v=pMYYx_im5QI
It's deterministic and assumes that those in power are one uniform force. It's still possible to push for a different future.
> When you don’t need as many people because of automation
You want to sell your stuff to someone, tho. So, unless you find a way to automate consumption as well, you do need people, and lots of them.
This works even better with a declining fertility rate!
The current wave of automation (LLMs) isn't capable of "fighting your wars".
Ukraine and Russia are already employing low-cost recon and hunter-killer drones.
Israel is already using sniper drones in Palestine that use AI to fly around and headshot whatever moves, as well as AI to select their bombing targets.
The future is now. Isn't it exciting?
Why does MrBeast dig wells in Africa?
To launder his reputation? To distract from the fact that the source of his wealth is selling gambling and sugar to children? To feel better about himself?
Palantir and Anduril :))
Your claim is not supported by the paper:
"Furthermore, employment declines are concentrated in occupations where AI is more likely to automate, rather than augment, human labor."
No mention of rent-seeking.
No evidence they are being economically short-sighted.
> they'd rather own 100% of diminished capacity than share proceeds from exponentially increased capacity
They're using cheap AI to replace more expensive humans. There's no reason to think they are missing some exponential expansion opportunity that keeping those humans would achieve, and every reason to think otherwise.
Probably because there are no free markets anymore; it's all monopoly, cartel, and/or regulatory capture.
Competition would fix a whole lot of problems.
I hope AI fuels a re-independence of many industries by making business software discovery and integration cheap and easy. Every plumber with more than 10 years' experience should own their company, with low-cost software running it. The efficiency gains from consolidating resources à la private equity for marketing and bookkeeping go away in an AI-powered world.
> So what comes next?
Feudalism.
That's optimistic.
Ancient Egypt (elite in pyramids, slaves otherwise) is more likely.
>> Feudalism.
> That's optimistic.
> Ancient Egypt (elite in pyramids, slaves otherwise) is more likely.
No, you're both being optimistic. The feudal lords had a vital need for serfs, and the pharaohs for slaves.
It'll be more like elite in pyramids, and everyone else (who survives) living like rats in the sewers, surviving off garbage and trying to stay out of sight. Once the elite no longer need workers like us, they'll withdraw the resources those workers need to live comfortably, or even to live at all. They're not making more land, and the capitalist elite have "better" uses for energy than heating your home and powering your shit.
I would say that’s still being optimistic. The end will come when Baidu’s, Facebook’s, and Microsoft’s AIs engage in total war against each other for survival while we watch in horror and incomprehension. The elites are just as fucked as anyone else.
Is that what you think of yourself?
> This reveals that these organizations are economic rent-seekers that happen to have productive workers, not production companies that happen to extract rents.
Your perspective is so contrary to reality that I'm actually not sure if you're trolling or not. There is no such thing as pure value creation. In order for labor to create value, it must be aligned with the company's value proposition, i.e. what convinces customers to pay for the value that the company provides. Half the people off in the corner building something that they think is valuable are actually building something that customers do not care about and won't pay more for, and that increases the company's maintenance burden.
Keeping labor aligned with value creation is the whole game. If it wasn't, then all these rent-seeking-first enterprises would have fired their layers and layers of middle management a long time ago; the company needs to pay them a salary (reducing profits) but they don't write any code / "produce any value". All these massive corporations would have moved to a flat management hierarchy a long time ago, if labor was truly capable of aligning itself to improving value generation; and if you think there's some nefarious/conspiratorial reason why massive corporations don't do that, then most of them would have been out-competed a long time ago by co-ops with flat management hierarchies that could produce the same value at a lower price due to lower administration costs.
Needing to hire employees is a necessary evil for businesses. Aligning employees is hard. Motivating employees is hard. Communication is hard. Businesses do not exist to provide people with jobs, which are created out of sheer necessity, and are destroyed when that necessity goes away.
You got there in the end. Hiring people is a necessary evil, and AI allows companies to massively reduce the necessity of that evil. Having done budgeting and forecasting for a wide range of organizations, I can tell you companies will do anything to avoid hiring an employee. I’ve seen companies spend 3x what an employee would cost just to avoid the increased headcount.
The forces of capital do not want to share a single penny and are solely focused on reaching a position of rent extraction.
What data or special insight do you have as to whether amplifying or eliminating is actually productive?
This argument is vacuous if you consider a marginal worker. Let's say AI eliminates one worker, Bob. You could argue "it was better to amplify Bob and share the gains". However, that assumes the company needs more of whatever Bob produces. That means you could also make an argument "given that the company didn't previously hire another worker Bill ~= Bob, it doesn't want to share gains that Bill would have provided blah blah". Ad absurdum, any company not trying to keep hiring infinitely is doing rent extraction.
You could make a much narrower argument that the cost of hiring Bill was higher than his marginal contribution, while the cost of keeping Bob + AI is lower than their combined contribution, but that's something you actually need to justify. Or, at the very least, justify why you know that better than the people running the company.
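To make that narrower argument concrete, here is a toy sketch of the two comparisons; the numbers are purely illustrative and come from neither the paper nor this thread:

    don't hire Bill:  marginal contribution < cost of hiring
                      e.g.  100k < 120k
    keep Bob + AI:    (Bob's contribution + AI uplift) > (Bob's wage + AI cost)
                      e.g.  (100k + 60k) > (120k + 10k)

Both inequalities can hold at the same time, so declining to hire Bill while keeping Bob and his AI tooling is internally consistent rather than evidence of rent extraction.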
It's really just the American companies deciding to do this. Seems like glorified suicide, tbh.
Late Stage Capitalism. The real paperclip maximizers are the Silicon Valley and Wall Street bros we met along the way.
ChatGPT (might have) made a few superfluous email jobs obsolete and the people responding to this comment are acting like we’re standing on the threshold of Terminator 3.
Don't underestimate how much of the economy is "superfluous email jobs". Have you seen how stupid the average person is?[0] These people need jobs too.
[0] I was going to mark this as sarcasm, but then I remembered that the US elected Donald Trump as president, 2 times so far, so I'm going to play it straight.
So instead of training and educating these people, you want them to keep those "obsolete" jobs????
A little bit late, aren't we??? Because if we did that, then we would still be using postmen to send messages.
Implying "superfluous email jobs" isn't a significant portion of the international job market. Most people that work in offices fit under this definition.
> Most people that work in offices fit under this definition.
Not at all. The majority of office jobs can't be automated by current-generation LLMs, because the jobs themselves serve either creative or supervisory functions. Generative AI might be able to fill in creative functions one day, but the whole point of a supervisory role is to verify the status of inputs and outputs. A lot of these roles already have legal moats around them (e.g. you can't have an LLM sign financial statements), but even if we assume that regulations would change, the technical problem of creating supervisory "AI" hasn't been solved; even if it were, implementation wouldn't be trivial.