I'm a hobbyist playing around. I recently dropped CC (which gave me a sense of awe 2 months ago) after they realized GPUs need CapEx, and now I want to screw around with pi.dev on a budget. Then I moved on to GH Copilot but couldn't understand their cost structure and ran out of quota half a month in, so now I'm on Codex. I don't really see any difference for the little stuff. I also have Antigravity through a personal Gmail account with access to Opus et al., and I don't understand whether I'm paying for it or not. They don't have my credit card, so that's a breather.

It's all romantic, but a bunch of devs are getting canned left and right, a slice of the population whose disposable income the economy depends on.

It's too late to be a contrarian pundit, but what's been done besides uncovering some 0-days? The correction will be brutal, worse than the Industrial Revolution. Just look at the recent news about cuts at Meta, Salesforce, Snap, Block; the list is long.

Have you shipped anything commercially viable because of AI or are you/we just keeping up?

> The correction will be brutal, worse than the Industrial Revolution.

Has it occurred to you that there might not be a correction, and that the outcome would still be brutal, at least on par with the Industrial Revolution?

It won't get that far.

It's physically impossible to build out the datacenters required for the "AI is actually good and we have mass layoffs" scenario. This Anthropic investment is spurred on because they've already hit a brick wall with capacity.

$40B goes a long way, but not for datacenters where nearly every single component and service is now backordered. Even if you could build the DC, the power connection won't be there.

The current oil crisis just makes all of that even worse.

We pretty much already had the layoffs, at least that's my perception.

The next level of layoffs is probably still 25 years out.

> The next level of layoffs is probably still 25 years out.

It hasn't even been 25 years since the previous layoffs before the current ones.

There's layoffs, certainly.

But all the economic indicators suggest those are "bad economy" layoffs dressed up as "AI" layoffs to keep the shareholders happy.

Do you mean as in there will be no happy ending / reset and not another century of prosperity?

I mean as in living through the industrial revolution would have been wild. So whether we have an AI revolution or an AI bubble it's bound to be a roller coaster.

And that's without accounting for the various wars (and resultant economic impacts) that are already in progress. A large part of what drove the meat grinder of WWI was (very approximately) the various actors repeatedly misjudging the overall situation and being overly enthusiastic to try out their shiny new weapons systems. If one or more superpowers decide to have a showdown the only thing that might minimize loss of life this time around is (ironically enough) the rise of autonomous weapons systems. Even in that case as we know from WWII the logical outcome is a decimated economy and manufacturing sector regardless of anything else that might happen.

> minimize loss of life this time around is (ironically enough) the rise of autonomous weapons systems

I think that just means the relative civilian loss of life will increase once again.

What strategic merit is there in targeting civilians or life critical infrastructure in a fully automated battlebot scenario? Perhaps it's naive but I would expect stockpiles, datacenters, and any key infrastructure on which the local semiconductor fabrication depends to be the primary targets.

The current reality doesn't match your expectations. Russia is using automated warfare to strike what are primarily human life-critical targets.

Look at Ukraine for answers and at how the Russians target almost purely civilian infrastructure and civilians in terror campaigns every single day and night, the same as the Nazis did to Britain in WWII. With exactly the same results, but they just double down and send more drones the next day.

Russia is really an empire of the dumb and subjugated serfs at this point (again, history repeats), but it is far from the only such place.

Don't expect more; most people are not that nice when SHTF.

Bubble or revolution: it's not a dichotomy.

Bubbles like the AI bubble are a game-theoretic outcome of a revolution. Many players invest heavily to avoid losing, but as a whole the market overinvests. This leads to a bubble.

Imagine you're a typesetter and they just invented computerized printing.

There has always been a gap between the experience of solo/small-shop developers and that of developers who work in teams in a large corporate environment. But thanks to open source, for the past twenty years at least, we have mostly all been using the same tools.

But right now, the difference in developer experience is vast between a dev on a team at a business with corporate Copilot or Claude licenses and bosses encouraging them to maximize token usage, versus a solo dev experimenting once every few months with a consumer-grade chat model.

Let’s take an extreme example.

Meta seemingly has a constant stream of product managers. If LLMs really augment the productivity of engineers, why isn't Meta launching lots more stuff? I mean, there's no harm in at least launching one new thing.

What are all those people doing with the so-called productivity enhancements?

What I'm calling into question is how much generating more code matters if the bottleneck is creativity/imagination for projects.

The only thing I've seen is a really crummy Meta AI thing implemented within WhatsApp.

It's allowed a sludge of internal tools to spin up, and more bloat. The ability to sandbag and overbuild these tools has gotten 2-10x worse.

The only solution I can think of is to drastically cut headcount so that productivity is back to prior levels and profitability is raised. Big Tech is mostly market-constrained, with not much room to grow beyond the market itself growing.

As for startups, seems like AI tools have drastically reduced their time to market and accelerated their growth curves.

Forgive my ignorance, but what exactly is the vast difference? Who's doing more of what, or whatever you're implying? And how do you quantify this?

The difference is (if you'll forgive me recruiting a couple of straw men to illustrate the spectrum we're talking about here):

Hobbyist solo dev: counting tokens, hitting quotas, trying things on little projects, giving up and not seeing what the fuss is about.

vs

Corporate developer: increasingly held accountable by their boss for hitting token-usage metrics; handed every new model as soon as it comes out; working with the tools every day on code changes that impact other developers on other teams, all of whom have access to those same tools.

Okay, so just to be clear you're not commenting on productivity? Or what does "changes that impact" mean?

I might be missing a lot of self-evident assumptions here, but I feel like I'm still missing so much context that I have no idea what this difference is actually describing.

If you have some objective measure of productivity in mind, feel free to share it, but no, that's not what I'm commenting on.

I'm talking more about why threads like this seem to be full of people saying "this has completely changed how corporate development works" and other people saying "I tried it a few times and I don't get the hype."