I can't wait until the AI people realize that without developers' original ideas, AI has nothing new to steal. If we don't create, AI will just spit out the same old concepts. What, you're gonna create the next generation of AI by training it on what the very same AI has already produced? C'mon now.
You don't get technical creativity reflexes by using AI. This is technical stagnation in the making. By cannibalizing its own sources, AI is ensuring that future generations are locked into subscription models just to do the most basic technical tasks. This is all obvious, yet we speed up every chance we get.
It might be a mistake to assume tomorrow’s training looks like today’s. Unsupervised learning is a thing and a very hot research topic, precisely because it avoids some of today’s big problems with acquiring the vast amounts of training data necessary.
Unsupervised learning has been around for years and is already how the current wave of models is trained (next-token pretraining is self-supervised). It doesn't mean no data; it means no human-provided labels for the data. So you still need creative new human ideas to move LLMs forward. LLMs != intelligence.
Exactly this. Even frontier models like Opus 4.6 have absolutely zero understanding of the task at hand. If you give them a problem they have not encountered in their training data, they will not solve it. You can, however, guide them to a solution, but at that point they're reduced to mere autocomplete. Don't get me wrong: models are getting better and hide very well that they don't understand anything; you can almost be fooled now.
Maybe there aren’t that many new/necessary ideas that can be mined from the fundamental building blocks of software development (languages, syntax, runtimes, static analyses, type checking, etc). Maybe people will continue to innovate by instructing models to build novel things out of those building blocks? Perhaps things we would not have thought of building before due to the effort required without LLM assistance.
It's the opposite. The less competent the average developer the more valuable coding LLMs become (as the only way for those bad developers to generate ok code). Eliminate the good developers and even bad coding LLMs become valuable.
I don't expect them to realise that until some time after it actually happens. When it remains a future hypothetical, it won't be accounted for.
https://en.wikipedia.org/wiki/Profession_(novella)
fwiw 90% of software is reinventing the wheel. 80% of devs have an itch to "rewrite from scratch".
AI will deduplicate all of this
My experience is that 100% of AI devs are reinventing the wheel, most of the time for no better reason than "I can do it" or "not invented here"
I opened LinkedIn today (out of habit) and the first post was someone explaining how much Slack costs and claiming that with AI every company can build its own Slack for $100. So that person decided to build an open-source Slack clone using Claude Code. Granted, there were a few sane comments pointing to good alternatives that have already been built.
But for me it's been a signal that people have no imagination, so they are just burning tokens for no reason.
This is fine. How else do you learn but by taking things apart and rebuilding them? This obsession with productivity is incompatible with onboarding new talent. Having 1000 versions of the same concept is exactly what progress is.
> 1000 versions of the same concept
That sounds beyond wasteful.
It is. Humans are messy.
Why would there be a lack of original ideas? People who are born to code, so to speak, will do it. Information wants to be free, as the saying goes. It only takes one instance of an innovation for it to be copied everywhere.
We don’t need the same volume of developers to have the same or faster speed of innovation.
And conversely if there is stagnation there is a capital opportunity to out compete it and so there will be a human desire to do the work.
Tl;Dr. People like doing stuff and achieving. They will continue to do stuff.
PS: it's too much to claim other people don't experience creative ideas using AI. You don't really know that's true. It hasn't been my experience, as I have finally had the capability and capacity to complete ideas that sat on my back burner for decades and move on to the next thing.
That’s the big scary point at the crux of all of this - you’ve had decades without the tooling to develop instincts. Nobody knows whether it’s possible to develop instincts with the tooling or what those instincts will look like. Creativity takes a degree of skill to execute on and the concern is that we’re potentially graduating people to painting the ceiling of the Sistine chapel before they’ve even learned to sketch.
At minimum, our current generation of leaders will have to get much better at managing resources and building people up. We have to up our games and build environments where the pursuit of deep understanding is permissible. Unfortunately with the current hiring issues, it’s totally understandable that young developers are scared to take time on tickets.
I can't repair my car, which used to be the hallmark of technical masculine skill in the era of Grease the musical, because mechanical maintenance hasn't been a primary skill in my lifetime. Nor do I have any idea how to manage a farm. I think the kids will be fine. On the other hand, I fear I will not survive once my Internet connection goes out.
It isn’t about training anymore. It is about harnesses.
Just look at new math proofs that will come out, as one example. Exploration vs Exploitation is a thing in AI but you seem to think that human creativity can’t be surpassed by harnesses and prompts like “generate 100 types of possible…”
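For what it's worth, the exploration-vs-exploitation trade-off mentioned above has a standard minimal form: the epsilon-greedy rule from bandit problems. This is a toy sketch of the concept, not anything from a real model harness:

```python
import random

def epsilon_greedy(estimates, epsilon=0.1, rng=random):
    """With probability epsilon, explore a random option;
    otherwise exploit the option with the highest current estimate."""
    if rng.random() < epsilon:
        return rng.randrange(len(estimates))
    return max(range(len(estimates)), key=estimates.__getitem__)

# With epsilon=0 the rule is purely greedy and always exploits.
print(epsilon_greedy([0.1, 0.9, 0.3], epsilon=0.0))  # prints 1
```

A "generate 100 types of possible..." prompt is the exploration half; ranking and pruning is the exploitation half.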
You’re wrong. What you call creativity is often a manual application of a simple self-prompt that people do.
One can have a loop where AI generates new ideas, rejects some and ranks the rest, then prioritizes. Then spawns workloads and sandboxes to try out and test the most highly ranked ideas. Finally it accretes knowledge into a relational database.
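A minimal sketch of that generate/rank/test/accrete loop, with hypothetical `generate_ideas`, `score_idea`, and `run_in_sandbox` stubs standing in for the model calls and sandboxed execution:

```python
import sqlite3

def generate_ideas(n):
    # Stand-in for a model prompt like "generate 100 types of possible...".
    return [f"idea-{i}" for i in range(n)]

def score_idea(idea):
    # Stand-in for a model-based ranking step; here just a toy heuristic.
    return len(idea)

def run_in_sandbox(idea):
    # Stand-in for spawning a sandboxed workload to try the idea out.
    return {"idea": idea, "passed": True}

def ideation_loop(batch=100, keep=5, db_path=":memory:"):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS findings (idea TEXT, passed INTEGER)")
    # Generate candidates, reject the low-ranked ones, keep the top few.
    ranked = sorted(generate_ideas(batch), key=score_idea, reverse=True)[:keep]
    for idea in ranked:
        result = run_in_sandbox(idea)
        # Accrete whatever was learned into the relational store.
        conn.execute("INSERT INTO findings VALUES (?, ?)",
                     (result["idea"], int(result["passed"])))
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM findings").fetchone()[0]
```

Whether a loop like this produces genuinely new ideas or only recombinations is exactly the point under dispute here.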
The Germans also underestimated the USA in WW2, insisting their own soldiers were superior and that the USA just had technology. But the USA outproduced them in tanks and machinery and won the war through sheer automation, even if its soldiers were regular joes and not elite troops.
Back then it was mechanized divisions. Now it is mechanized intelligence.
While Stalin said: Quantity has a quality all its own.
There is no "new ideas" with AI. Claiming the opposite is a fundamental misunderstanding of the technology.
While that’s kind of true in some sense, I think there’s an argument to be made for the contrary: that the mechanism for generating new ideas in humans is not quite as special as we would like to think.
In other words, creativity in humans is arguably just as derivative as in machines.
I think this can be falsified by just considering the history of humanity. It wasn't that long ago that human language literally did not even exist. And our collective knowledge wasn't all that much more than 'poke him with the pointy end'. Somehow we went from that to putting a man on the Moon, unlocking the secrets of the atom, and more. And if you consider how awful we are at retaining/sharing information and just general inefficiencies due to the fact that we're humans and not just logical information processing machines, we did all of this in little more than the blink of an eye. This is something that seems to certainly be rather special.
All that humanity has achieved happened due to the simple loop of identifying a desire/need and finding a way to satisfy it. Also known as reinforcement learning. The only thing that really differentiates humans from machines is... history. We've been learning and passing on our knowledge to successive generations over millennia. Nothing really special there; give the machines a few years to learn and see what happens.
What needs do machines have? What desires do they have?
I’m not claiming an LLM is structurally or functionally equivalent to a human brain. I just said that what we call “creativity” is in fact a very derivative thing.
I hear this sentiment a lot but it doesn’t ring true for me.
What is an idea really and what’s your definition of new?
If I get an LLM to spit out, I dunno, a deployment system written in Haskell that uses BitTorrent or something, none of those bits are new, but there will certainly be unique challenges to solve in the code, and it's a new system.
Where is the line for new? Is it in combining old ideas? If not then does any software have “new” ideas? It’s all combinations of processor instructions after all…
What I am excited about is the potential for LLMs to draw conclusions from the last 150 years of scientific papers.
There have been lots of instances of knowledge being rediscovered even though it had previously been published and was sitting forgotten on some shelf. LLMs' ability to digest large volumes of data will, I think, help with this issue.
We will still need to reproduce and verify conclusions but will be interesting to see what might come from this.
I don't think all sides of this discussion agree on what a "new idea" is. I am a very creative person, but I've never had a truly original thought, and I don't know how having one would be possible.
It depends on what layer you look at I think, shoulders of giants and all that..
that's only partially true.
AI can innovate in synthetic-realm of novel ideas, while real-world novelty will remain untouched.
There are different types of novelties
If AI could innovate, it wouldn't be a public product. It would be a cash cow. Why give your customers the ability to come up with new and amazing ideas when you could just keep it for yourself and launch a thousand products? The USA is a capitalist society. It doesn't share profitable ideas.
And if AI was really about productivity they'd be talking about doing more faster with the same workforce, not reducing the workforce.
if you like, the business model is called Innovation-as-a-Service :)
That's perfectly aligned with capitalistic motivations
What is a "real-world novelty" and what prevents AI from touching it?
Innovation is irrelevant to pushing up this quarter's numbers. No one actually values unique and novel ideas. The only thing that matters is shipping something right now that can make an impact on this quarter's numbers.
Who cares if it's derivative slop or a straight up bootleg of something else so long as the number goes up
Technically, all the problems that almost any given business needs solved today have already been solved umpteen times over the years. There are no new problems that can't be solved by porting and/or combining old solutions.
"Everything has already been invented." - Some 19th-century scientist who had no imagination to see the wave of technological innovation that was coming.
That's the literal definition of stagnation. That is not compatible with growth.
Also, "everything worth inventing/exploring has already been" is itself not a new idea. It's precisely the mindset AI reinforces, and it goes against human nature (and capitalism), since history has repeatedly proved that statement wrong.