It's far more exciting than sad.
Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way.
Look to the positive instead of lamenting something that never would have happened.
It's unbelievably exciting that you can now program a computer virtually without the limitation of your ability to hand code it.
The result is unimpressive either way -- it's the journey that is exciting for these kinds of projects.
I understand that for some people it's the display of human wizardry that matters.
For me it's about making the computer do awesome things - I do not care how I get there I just want it to do whatever I can conjure in my head.
As much as I enjoy the novelty of asking ChatGPT for anime pictures, I do not, for a single moment, consider myself a doer of anime pictures.
And a fair aside: the result will be a "good enough" approximation of what I conjured in my head, but never the thing itself. For me to make the exact thing I conjured, I would have to pick up the mouse and draw the rest of the owl. I don't know whether that says more about my imagination being demanding or about my standards.
True, if you use only ChatGPT to do something and accept the generated stuff as the final output.
Probably not the case for anime pictures, but in other domains you can use ChatGPT as a first pass and then go on to improve it from there. To make a parallel: if you draw with a pencil on a piece of paper, you would still think of yourself as a doer even if you did not manufacture your pencil or paper.
There's still personal skill expression in driving cars and using a pencil for drawing, that makes the difference between drivers and artists visible enough to justify hiring one over another.
So far I can't say the same for leveraging LLMs and, in the off chance that there is, we have an entire software development industry that doesn't even know how to filter for "it".
It's usually not even the display.
When I go on a trek, the end of trek landmark is nowhere nearly as significant as the experience of reaching it.
If I were magically transported there without the lived experience, it would take almost all of the joy out of it. Some people get a kick out of doing hard things that are interesting but seemingly beyond one's ability. Making it an easy commodity spoils the fun.
As for teleportation, if it were, say, a trip to the moons of Saturn, I could make an exception.
Nah. I'm not going to yearn for the days of hitting steel on an anvil when we can have steel produced in a factory.
Have you ever done blacksmithing? It’s tremendously satisfying.
Sure, if you want 300,000 spoons, it’s far better to use a factory process and get essentially identical results. But if you only want a few spoons and accept (or even value) that the spoons will all be a little different, hand-forging them is quite enjoyable.
I’ve written enough assembly and done enough blacksmithing to know that the metaphor isn’t quite apt. But there’s tremendous effort and satisfaction involved in both.
Hobbies are great. Making a living is usually a separate endeavor. I don’t want to pay for an artisan spoon. There will be a limited market for artisan software. But make no mistake, we are entering the era of software mass production. Say goodbye to your chisels and rasps. If you want to make money you will need to operate the machine that builds the machine.
You won’t be able to enjoy your free time playing with computers if anthropic et al make you jobless.
The “you” doesn’t necessarily refer to you. I’m addressing 90% of the developers out there. We love playing around with technology… but I doubt we will be thinking the same once we become unemployable. But here we are, having fun with the tools of companies that want to finish us. How ironic.
Fiddling while our home burns has been a beloved pastime for many people, from emperors to passengers on ocean liners to prisoners in camps. What else is there to do anyway? Sometimes history must take its course before humanity as a whole recognizes its folly.
This is silly.
While I work in IT today, that wasn’t always true. I am certain I spent more free time playing with computers when my work did not involve computers at all. While I enjoy working with computers at a variety of different levels, when I do it all day, I don’t typically wanna do it when I get home. If Anthropic means there are no more IT jobs, software jobs, etc. etc. etc. (which I think is highly unlikely) then I guess I will have to do a non-tech job just like 99% of the other human beings. If that comes to pass, I expect in my spare time I will suddenly reacquire a love for tinkering with computers.
Also consider the great fortune of dicking around with computers ever being so lucrative in the first place even if the gravy train eventually stops. We were lucky. Most hobbies aren’t anything like that.
> Got an idea that you'd need assembly language for - now you can do it instead of.....
Nobody actually needs a web server built in assembly language, it serves no practical purpose. And I say that as someone who learned to program 6502 assembly language in 1983 and has sporadically used assembly of various architectures since.
The absurdity of building it would have been the curiosity draw pre-LLMs, but when it existing is just a series of prompts away it really loses all of its meaning.
But yeah... hooray for AI. Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.
> Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.
Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released. Presumably they were using them for ads, because their public demos were insane things like talking to the moon.
> Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released.
According to friends who worked at Google (no direct knowledge myself, so don't know exactly how true it is), they mostly sat on the tech. Google News had internal prototypes of using them to expand/contract/summarise and/or add details/context to news articles and translate them to different languages, but it was never fully productised.
Then after ChatGPT got popular, sudden panic to start using them in products company-wide.
It has always been possible to do it. LLMs are not a particular enabler for that.
The difference is that now it is worthless: there is no learning, no person caring about the result, nothing aspirational for the public to look towards... we used to enjoy those challenges, used to be proud of solving complex problems... now? Yeah, whatever, execute execute commit push, let another LLM "review" and call it a day.
The difference is not that it’s “worthless”. The difference is that now it’s “practical” to implement given the low effort.
I wouldn’t be sad about defeating lower complexity challenges. There are always higher complexity challenges that arise once we start operating in a world when you can do more. The bar raises.
The point is the death of the celebration of excellence and technical mastery.
Once insurmountable challenges are now trivial to implement with, as you say, "low effort."
For those who were attracted to computing by the grind and the grand narrative that you, too, with sufficient effort, discipline, and merit, could become a revered craftsman, LLMs trivialize an entire lifetime of practice. I can't think of anything more demoralizing.
If your goals were fame, then yes. But you can still pursue excellence even if there is an alternative “easy” path.
The equivalent is something like hand tool woodworking - it’s still a thing despite the advent of machines, but more of a niche. You can still aim to become excellent, but maybe you won’t be famous.
> but maybe you won’t be famous.
Or employable. Which sucks if you're over 50.
That also sucks if you are nowhere close to retirement or to having a beefy bank account, and you depend on regular monthly payments.
Did hammers obviate the technical mastery of finding a suitable rock? Or did they elevate the definition of “technical mastery”?
llms are nothing like hammers or other tools.
They are factories that produce goods on a whim. There is nothing to compare them to, as we have never had anything like that. This is not an industrial revolution; this is the obliteration of work at its core.
I look at them as lab grown bacteria. We’re in the early days and still have a lot of contamination we still don’t understand. They don’t always produce a viable result, and sometimes they break test rigs.
Just because they’re not a pure extension of our bodies or minds like a hammer or pencil doesn’t mean they will magically break the concept of work.
Would you apply the same reasoning to the building of horse drawn carriages and mass produced motor vehicles? A hand built PDP-11 to a Thinkpad?
[dead]
No, increasing the supply of something always decreases its value. It does not necessarily increase its demand. That is a basic economic rule. Note that I use "value", not "cost". The distinction matters.
Yesterday I went to a bookstore: I saw an interesting book cover, then thought "ah, looks like AI"... and all the excitement went away. There won't be a "new complexity frontier" for the artists who used to draw book covers. Or for writers, actors, etc.
AI is currently not enabling any use case that previously was "too hard". It is just reducing the value of stuff by increasing the supply and making people delulu about what they can achieve without proper knowledge.
Making good stuff requires paying attention to a lot of details. Even "simple" stuff can become incredibly complex once you actually learn how it must be done. Most of what we humans do is working in that space, not chasing Manhattan Projects.
What do we get if the population is disconnected from the true complexity of creating stuff? Perceived value decreases, and if everything is perceived as equally bad, people will stop caring about quality. That is why fascism likes uneducated people.
So that covers AI's contribution to "value" itself.
Now, is it true that AI will allow us to create more complex stuff that is not practical now? I would strongly disagree. The reason is Kolmogorov complexity: it is not possible to find the shortest program that describes a task. Describing it in natural language will not magically give us permission to avoid describing that complexity. What is the point of switching from C to English if I still have to specify every little detail, in a far more ambiguous and verbose language? Programming languages are not the challenge; they are the solution to the problem of having to specify complex tasks in a reproducible way.
Gathering everything together: that is why I think generative AI makes things worthless. Value reduction; reduced perception of complexity (which further reduces value); a population ignorant of that complexity choosing subpar options because "they are all the same garbage"; and no superior engineering capability gained anyway.
> The difference is that now it is worthless
Writing whole software projects in assembly has been worthless and pointless for a couple of decades now. Even the projects who can put together a solid case will limit assembly to very specific components executed only in specific bits of a hot path. Perhaps the most performance-sensitive code we have today is high frequency trading and that field is dominated by C++.
Also, virtually all mainstream compiler suites have flags that output assembly, and that feature is largely ignored and unused.
That's just not true... the flags to get preprocessed output and assembly are quite useful and used a fair bit, in fact. Multiple reasons: sanity-checking your code, finding bugs, or even finding compiler errors.
The point is that these projects had worth because of what the programmer got out of the learning process, not because of the end result.
A lot of FFmpeg is written in assembly, and a lot of things are using FFmpeg in the backend.
Yep, another humane thing going to get killed, because people are naive, gullible and basically idiots handing out their expertise on a platter to faceless corpo entities.
What's next, human-to-human contact abstracted away by brain stimulation?
And the transhumanist arsewipes gonna have a field day.
Never too late to ignite the nukes...
> What's next, human human contact abstracted away by brain stimulation?
Of course! Corona/junta/scarecrowvirus don't transmit over the wire, while ads, taxes and surveillance do alright!
If you've got an idea that you need assembly language for, you can use a compiler to create that assembly language. It'll probably do a better job than an LLM. Assembly projects are interesting because they're written in assembly, not because they contain assembly.
You'd be surprised, again... most compilers don't generate very good code, mostly because:
1. the time for optimisation is limited
2. the constraints are overlapping and just completely intractable beyond a single function (do you want to inline this, saving on the call and increasing binary size, or not do it because it's cold?)
3. they don't have domain-specific knowledge about your code, and even with PGO, they might incorrectly decide what's hot and what's not - typical example are program settings. You didn't enable a setting during PGO instrumentation, compiler sees you didn't call that path, shoves it out of line. Now your PGO-optimised code is worse than -O2. And compilers have different levels of adherence to manual branch hinting - on MSVC you get a reorder at best, Clang and GCC try much harder at [[likely]] and [[unlikely]].
4. There's still quite a bit of low-hanging fruit left, mostly because progress is jagged ;) For example our calling conventions generally suck - this is actually why inlining is so helpful - and the inertia makes everyone emit the default calling convention and that's it.
For example, did you know that compilers have very inconsistent support for struct unpacking? Passing a struct's members as separate scalar arguments can be much faster than passing the struct itself by value, because the scalars go through registers on the MSVC ABI while the struct gets lowered to the caller passing a pointer to the stack. Before someone says "oh, this just means MS sucks" - fair, but for std::unique_ptr the situation is the other way around... on the MSVC ABI the callee cleans it up so it's truly zero-cost, but on the Itanium ABI using it is worse than using T* as a raw pointer... see the GCC codegen :)

These examples might seem a bit cherrypicked, but this is only scratching the surface, not to talk about the codegen in higher-level languages, which is even more dreadful. Manually optimising your code can usually get you a magnitude's worth of free performance, which is just tragic.
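A minimal sketch of the kind of signature pair being compared (the struct and function names are mine, not from the thread; on the MSVC x64 ABI, an aggregate whose size is not 1, 2, 4, or 8 bytes is passed via a hidden pointer to a caller-allocated copy):

```cpp
// Hypothetical illustration of struct unpacking. The 12-byte Triple
// below doesn't fit MSVC x64's by-value size rules (1/2/4/8 bytes),
// so it is passed through memory, while the unpacked version keeps
// all three ints in registers.
struct Triple { int x, y, z; };

// Members passed as separate scalars: arrive in registers.
int sum_unpacked(int x, int y, int z) { return x + y + z; }

// Whole struct by value: caller copies it to the stack and passes a pointer.
int sum_packed(Triple t) { return t.x + t.y + t.z; }
```

Both compute the same result, of course; the difference only shows up in the generated code, which is exactly why inspecting the assembly output matters.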
I wouldn't even rule out LLM codegen in the future - although they're quite unreliable today so you'd get miscompiles like crazy - but there's just so much low-hanging fruit left on the table that it wouldn't be too out of step...
Expanding the struct to two arguments does not take longer than rewriting your whole project in assembly.
I've never said that, but using assembly in certain places can certainly be justified, especially for the performance-intensive parts.
I think for programmers the joy is in writing it on their own, not in just having a toy. If I just wanted a web server in asm, the easiest thing would be to decompile an existing one into assembly and call it a day.
Only exciting if you already have a lot of programming under your belt, like Carmack, or if you're a product guy.
> without the limitation of your ability to hand code it.
Isn't that kind of view pathetic and sad, though? Why would anyone pick up a guitar or play a piano if they could just listen to the same song already made by someone else? I struggle to understand this view from people who pretend not to understand why being an expert at some skill is perceived as valuable by some people. This also belies the next problem with this line of thinking, which is that it says "we don't need to learn X to do Y because we have AI" but misses that the same AI could easily replace the need to have you think to do Y in the first place. I don't know.
In my experience, people who have not hand-coded enough imagine that the hard part is coding, and not clearly defining all the possible edge cases and use cases.
So, in my view, more people will (or should) now understand what is hard about building complex things, once they pass the stage of "I have a nice POC that works for this one case".
I don’t know. The edge cases become very clear when coding, because there are explicit and precise guarantees of how the code will behave that you can reason about (or when there aren’t, you know that you are in trouble and code defensively around it, and you can reason about that defensive code), which isn’t the case when vibe coding. When coding, you can prove the code correct in your head, and the edge cases are revealed by that process. But that process isn’t possible with vibe coding, because what exactly a prompt will produce isn’t predictable, and doesn’t have guarantees attached that you can reason about.
Completely agree. My point is that they will start realising it not when vibe coding (as you say, nobody will ask them for clear specifications) but as soon as they try to use it for more than one happy-path demo. Then they will "patch" one case and break another, and so on. And then they will probably complain that LLMs are crap at coding, the same way some lazy product managers complained before when asked questions they had not thought about...
I see. Some people have the opinion that the things that are hard about hand-coding remain essentially the same when LLM-coding. They argue that this ensures their employment despite the shift from hand-coding to LLM-coding, and that the necessary expertise and diligence (e.g. appropriate architecture and test coverage) remains the same, and in consequence quality doesn’t suffer. I thought that you were arguing in that direction.
> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing
But you're not doing it. The AI is doing it.
If the op can write a web server in assembly language then I'm pretty sure they could have done it in a higher-level language. But they did what they did for the journey and the learning along the way. Vibe coding it omits all that, and misses the point of the exercise.
I do believe this is just the next step in languages. We've come this far trying to bring code closer to natural language, and now we have the closest thing to a translator in our generation. It's an exciting time; just don't pay attention to the talking heads.
Which is why now companies can happily reduce head count.
This idea that LLMs are going to enable us to do things we wouldn't have done before, and that therefore overall productivity and value are going to increase "exponentially", seems naive about basic economics.
If LLMs are good for doing things we aren't already doing, it indicates the overall addressable "value" that LLMs could provide for such things is actually quite low. If the task has necessary prerequisites that you don't currently possess, but you haven't spent the effort to jump that hurdle yet, it's a good indication that the value of completing that task is very low. Even if, maybe especially if, we're talking about personal projects where the value proposition is personal and not monetary, it indicates the person already feels in their bones that the return on doing this thing is not worth the effort.
I'm struggling with this with my leadership at work. We have developed a thing that is going to remove the need to hire temps [0] for data entry when we get clients who send us large amounts of their "data," aka "a thousand 30-slide PowerPoints, each with one line graph of interest sitting in the corner of one slide somewhere." It is an ask that comes up a lot, and it's always very expensive for the client on both the time and money axes, but overall it's just a small part of the contract budget. I'm all for using what we've built to cut down the time cost for our clients, but my leadership, thinking it's going to lead to massive cost savings for our clients, seems to forget just how much time we spend in meetings and planning and documentation and testing and reevaluation and more meetings versus actually executing on things.
It's also bad business. To me, giving results faster should be a premium offering. We should be charging more, not less.
[0] We don't actually hire temps; we turn our junior data analysts into temps by burning them out with tons of unpaid overtime. They then leave, and we have to backfill them at rather extreme hiring overhead compared to the cost of the direct contract overtime we didn't pay for.
The biggest issue with LLMs is that they make those who have no idea what they are doing seem like they know something.
>> without the limitation of your ability to hand code it.
Yeah, it's nice, though in 100% of cases this results in software of even lower quality than we had before.
So it's hard to tell where the win is here. The fact that you can generate some code does not make it a win, just a curious fact.
> Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way
If you are having an LLM generate the assembly language for you, that is not even remotely close to writing the assembly language yourself.
I don't find it exciting even in the slightest. I can think of nothing more boring and unsatisfying than having an LLM generate all of your code for you.
I mean, I understand why some think this could be exciting from an "I can get something done fast because the LLM generates it for me" standpoint -- because their excitement stems from something getting done at all instead of just sitting in the pool of ideas forever. However, you will never know the code generated by an LLM the way you know the code you wrote yourself. Nor will you ever gain the same satisfaction from finishing a project whose code was written by an LLM that you gain from finishing a project where you wrote the code yourself.
If you are a person that doesn't care about coding or doesn't like to code at all, I could totally see why you'd find this exciting - to you it's all about avoiding work you don't care for or want to do yourself anyway. Also, a high percentage of people who do love coding have zero interest in writing assembly language, so if they were required to write some for a project, I could also see them being happy with having an LLM generate that part of the project for them.
However, I think for people who genuinely love to write code, the situation is the opposite of what you said -- it is far more sad than it is exciting. In fact, for many of them it has already reached the point of being depressing, for many reasons. I don't think it is primarily because the LLMs have gotten significantly better at generating code (which they have). I think some of the bigger reasons are that so many of the people who now pay others to produce code have:
1) got a very short-sighted and "rose-colored-glasses" view of what LLM-produced code will do for their company.
2) deeply under-appreciate the value of having a person or team of people who understand their business, the hardware and software required to support it, and the work required both to keep things running and to handle new requirements as they come along. Because of that under-appreciation, many have already punted (and/or are preparing to punt) those people to the curb, because they think they can just have an LLM do their job and save a ton of money.

In the long run I think most (if not close to all) of those businesses are going to be sorry if they over-indulge in replacing human-produced code with LLM-produced code. I think the ones who lean too heavily on the LLM side will eventually collapse into a heap of unmanageable dumpster-fire code that they can't understand or maintain. A whole new world of incidental complexity will consume every project, and in the long run it will just eat them alive (figuratively speaking, of course :-D ).