I wish there was some sort of community project where engineers could whistleblow about their product falling apart through misguided AI pushes.
I see it everywhere in my private circles, but I'm not sure the story is truly reaching the general public.
I've gone through many, many fads and plenty of smoke during my career, but this is the first time I'm actually worried about things falling apart.
>I wish there was some sort of community project where engineers could whistleblow about their product falling apart through misguided AI pushes.
It would be an awesome thing to see, but it would need to be hosted in another country, like PirateBay.
Also, what is their incentive?
Yeah, it is wild seeing with my own eyes how bad these tools are in a lot of cases. We do have some vibe coders on our team, but they are basically banned from my current project because they completely ruin the design and nuke throughput. HN would have me believe I'm a Luddite who shouldn't be writing code, however. I truly do not understand how to reconcile this experience, and many times it is too complicated a topic to explain to someone who isn't an engineer. AI is the ultimate Dunning-Kruger machine. You cannot fix what you do not know, because you do not know that you did not know.
As you say, I think things are just going to fall apart and we're just going to have to learn the hard way.
No, these tools really are great in a lot of cases. But they still don't have general intelligence or true understanding of anything, so if people use them wrong and rely on their output because it looks good rather than because they verified it, then that is on the people using them.
I mean, that is fine, but then it seems like people at large are not using them "right". I think you'll find that since these tools are convenient and produce a lot of code, measured in lines, verification goes out the window. Due diligence was hard enough before these tools existed.
Oh, I certainly do find it tempting to get lazy with these tools, but I have learned that there are side projects where vibe coding is fine, and important codebases that can be improved with LLMs, but not if you just let agents loose on them.
fatbabies from the dot com days