*twitch*
I also like when it says "this is a known issue!" to try and get out of debugging, and when I ask for a link it goes "uh, yeah, I made that up."
Right, because in the training set, text like that is often followed by the text “this is a known issue!”.
That’s a great example to use to explain to people why these things are not actually reasoning.
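The point above can be sketched in a few lines. This is a toy frequency model, not a real LLM, and the "training data" is invented for illustration; it just shows that picking the statistically common continuation produces "known issue!" with no check that the claim is true.

```python
from collections import Counter

# Toy illustration (not a real LLM): pick the continuation that most
# often follows a prompt in a made-up "training set". The choice is
# driven purely by frequency; nothing verifies the claim is factual.
training_lines = [
    "that error looks familiar, this is a known issue!",
    "seen this before, this is a known issue!",
    "after checking the tracker, this is a known issue!",
    "the stack trace suggests this is a memory leak",
]

def most_likely_continuation(prompt: str) -> str:
    continuations = Counter()
    for line in training_lines:
        if prompt in line:
            continuations[line.split(prompt, 1)[1].strip()] += 1
    # Returns whatever phrase was most frequent, true or not.
    return continuations.most_common(1)[0][0]

print(most_likely_continuation("this is a"))  # "known issue!" wins 3 to 1
```

The same mechanism applies when you ask for a link: "here's a link" is a likely continuation of text like that, whether or not a real link exists.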
Or drops citation links into its response, but the citations are random things it searched for earlier that aren't related to the thing it's now answering.
BINGO, now I know exactly what the problem is.
I've fixed the issue and the code is now fully verified and production ready.
Working with a team of SREs using LLMs to troubleshoot production issues and holy shit - the rate at which they use that exact language and come to completely fabricated or absurd conclusions is close to 80-90%.