>You should remain open to new things in this industry

I'm open to new things. I've seen demos, attended presentations, and spent a long time toying around with it myself. I have not been convinced there is any meat there, not in its current iteration. LLMs are designed to make things that "look" like human output and thus are very good at hiding bugs. They're okay at getting the first 20% of a project done, but that was never the hard part. It's always been the last 20%, and modern LLMs simply cannot do it. Not on large-scale projects.
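To make that concrete, here's a made-up illustration (hypothetical code, not output from any particular model) of what "looks right but hides a bug" means in practice: it reads cleanly and skims as correct, yet it silently drops data.

```python
# Hypothetical illustration (not real model output): reads as idiomatic Python
# and passes a quick skim, but silently drops the trailing partial chunk.
def chunk(items, size):
    """Split items into consecutive chunks of the given size."""
    return [items[i:i + size] for i in range(0, len(items) - size + 1, size)]

print(chunk([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4]] -- the 5 quietly vanishes
# The fix is range(0, len(items), size); nothing about the broken version looks wrong at a glance.
```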

New things have come and gone. So far the only thing I'm convinced of is that it's easier to get funding when you can claim you use AI. That's it.

> I have never copy and pasted code into development from an LLM/AI helper

Well, that's simply a different reality from what my employer is encouraging, so it's not relevant. They not only want us to copy and paste; they want us to delete otherwise functional code to make it easier to paste in AI-generated stuff.

Asking questions is fine; that's much closer to an augmented search engine than prompt engineering. You're describing something different from what this post is about.

>5 years is not what I would consider a big bargaining chip

I'm not bragging. I'm giving context. If I were at 0 yoe or 20 yoe, those would be relevant too. And for what it's worth, I also started in middle school.

>one leadership away from asking their employees the same thing your employer is

Yeah that's probably true

>I'm not bragging. I'm giving context.

I didn't think you were bragging, and I hope I didn't come across as trying to put you in your place.

I'm responding with market context. The market is upended right now with no end in sight. Also, most employers, if not meaningfully all of them, are involving AI or soon will be. Many, if not most, people applying for decent positions right now have 3x the experience and are very willing to do whatever it takes.

Don't let your principles end with you sleeping in your car.

> LLMs are designed to make things that "look" like human output and thus are very good at hiding bugs.

This can be true; it was definitely more often true in the past. But there is a time and a place for human expression, and it probably isn't in code. Your human expression is likely helped by tools anyway. I doubt you're writing in Notepad, but your IDE doesn't get thrown out the window just because it can't fully replace you or write code for you.

If you are being blindly told to copy/paste from an LLM, then use that as part of your ideation and work from there, using AI tools as much as you can in ways that actually work. Become a leader in this new frontier by delving in (just kidding, that's a meta joke about another article trending on AI).

> They not only want us to copy and paste; they want us to delete otherwise functional code to make it easier to paste in AI-generated stuff.

Your post needs more detail if you want people to reply to your exact situation, but I think you can make clear arguments against doing this, then do it anyway for three weeks, followed by the obvious: backtracking.

Leaders are by nature often encouraged to try new things. Standing in their way won't help you, but you can warn them, do it, and then help them get back on track. By being a team member in this way, you are not in charge, but you can build trust equity if these leaders stick around and have techy ideas in the future. In my experience, I usually outlast bad leadership (and their associated ideas). You have to be correct and not act like you're the boss to survive it, though!

Feel free to make your own decisions about this stuff, but know that there are people with lots of experience and success in the industry using LLM coding tools successfully (I'm one of them).

I am in a situation where AI was mandated. I was skeptical, but I took it as a chance to try it out, and now I can't imagine going back.

Yep, if you know what you're doing, if you have good software dev and review practices, and if you manage to aim the AI footgun away from your body, then the productivity boost is absolutely gigantic.