I think this is a bit like attempting your own plumbing. Knowledge was never the barrier to entry nor was getting your code to compile. It just means more laypeople can add "programming" to their DIY project skills.
Maybe a few of them will pursue it further, but most won't. People don't like hard labor or higher-level planning.
Long term, software engineering will have to be more tightly regulated like the rest of engineering.
> Long term, software engineering will have to be more tightly regulated like the rest of engineering.
That's for sure. Software is too important, not only to the economy but to people's safety, for it to stay unregulated. Cramming ads into all software makes it fragile and prone to hacking. Bloating it with features to sell stuff instead of doing its primary function is equally bad for everybody. Don't get me started on bad engineering practices.
I agree with the first part of your comment, but I don't follow the rest - why should SE be more tightly regulated? It doesn't need to be; if anything, regulation will just stifle its progress and evolution.
I think AI will make it more visible where code diverges from the average. Maybe auditing will be the killer app for near-future AI.
I'm also thinking about a world where more self-taught programmers are trying to enter the workforce using AI. The current trajectory is a continued lowering of education standards and a political climate hostile to universities.
The answer to all of the above, from the perspective of people who don't know or really care about the details, may be to cut the knot and impose regulation.
Delegate the details to auditors with AI. We're kinda already doing this on the cybersecurity front. Think about all the ads you see nowadays for earning your "cybersecurity certification" from an online-only university. Those jobs are real and people are hiring, but the expertise is still lacking because there are no clear guidelines yet.
With the current technology and the generations of people we have, how else but with AI can you translate NIST requirements, vulnerability reports, and other docs that don't even exist yet but soon will into pointing someone who doesn't really know how to code at a specific line they can investigate? The tools we have right now, like SAST and DAST, are full of false positives, and non-devs are stumped on how to assess them.
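To make that concrete, here's a rough sketch of what AI-assisted triage could look like. Everything in it is hypothetical: `ask_llm` is a stand-in for whatever model API you'd actually use, and the finding format is a simplified stand-in for what real SAST tools emit (most export SARIF or similar JSON).

```python
from pathlib import Path

def ask_llm(prompt: str) -> str:
    """Placeholder for a real model call (hosted API, local model, etc.)."""
    raise NotImplementedError

def triage_finding(finding: dict, context_lines: int = 10) -> str:
    """Turn one SAST finding into a plain-language verdict a non-dev can act on.

    `finding` is assumed (hypothetically) to look like:
        {"file": "app/auth.py", "line": 42,
         "rule": "sql-injection", "message": "..."}
    """
    source = Path(finding["file"]).read_text().splitlines()
    line_no = finding["line"]  # 1-indexed, as most tools report
    # Grab a window of code around the flagged line for context.
    start = max(1, line_no - context_lines)
    end = min(len(source), line_no + context_lines)
    snippet = "\n".join(f"{n}: {source[n - 1]}" for n in range(start, end + 1))

    prompt = (
        f"A static analysis tool flagged rule '{finding['rule']}' "
        f"({finding['message']}) at line {line_no} of {finding['file']}.\n"
        f"Here is the surrounding code:\n{snippet}\n\n"
        "Is this likely a true positive or a false positive? "
        "Explain in plain language for a security auditor who cannot code, "
        "and say exactly which line to look at."
    )
    return ask_llm(prompt)
```

The point isn't that the model's verdict gets trusted blindly; it's that a non-dev auditor gets a readable starting point instead of a raw report with hundreds of unexplained findings.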
Even before this latest round of AI stuff, it was a concern that we overwork and overtrust devs. The principle of least privilege isn't really enough, and it's often violated in any scenario that isn't the usual day-to-day work.