Here’s yet another horror story from someone who doesn’t use punctuation. I’d love to see the rest of the prompts; I’d bet real cash they’re some flavor of:
“but wont it break prod how can i tell”
“i don want yiu to modify it yet make a backup”
“why did you do it????? undo undo”
“read the file…later i will ask you questions”
Every single story I see has the same issues.
They’re token-prediction models: they predict the next word from a context window that’s half structured code and half what reads like a 13-year-old texting her boyfriend. I really thought people understood what “language models” are actually doing, at least at a high level, and would structure their prompts in the style of the training content they want the LLM to emulate.
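To make that concrete: if the model continues in the register it’s given, it helps to hand it a clearly punctuated, structured request instead of a texting-style one. A minimal sketch (the `build_prompt` helper, its field names, and the example filenames are all hypothetical, not any particular tool’s API):

```python
# Hypothetical helper: assemble a vague request into a structured,
# clearly punctuated prompt, since a token predictor tends to
# continue in the style of the text it is given.

def build_prompt(task: str, constraints: list[str], context: str) -> str:
    """Build a prompt with explicit task, constraints, and context."""
    lines = [
        f"Task: {task}",
        "Constraints:",
        *[f"- {c}" for c in constraints],
        "Context:",
        context,
    ]
    return "\n".join(lines)

# The sloppy original vs. a structured rewrite of the same request.
sloppy = "i don want yiu to modify it yet make a backup"
structured = build_prompt(
    task="Create a backup copy of config.yaml before making any changes.",
    constraints=[
        "Do not modify the original file yet.",
        "Name the backup config.yaml.bak.",
    ],
    context="We will review the proposed changes before applying them.",
)
print(structured)
```

Both strings ask for the same thing; the second one just looks like the kind of well-formed technical writing you want the model to continue producing.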