Ultimately, storing secrets on disk was the problem here. Never store secrets on disk. This is software engineering 101. The excuse that "we didn't know the scope of the token's access" is absurd. You knew it was a secret with access to production infrastructure; that's all you need to know.
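A common alternative, sketched here in Python, is to inject the secret into the process environment at deploy time so it never lands in a config file or repo on disk. The variable name `DB_TOKEN` is illustrative, not something from the incident being discussed:

```python
import os

def get_db_token() -> str:
    # Read the secret from the process environment rather than a file on disk.
    # DB_TOKEN is a hypothetical name; the orchestrator/CI system would set it.
    token = os.environ.get("DB_TOKEN")
    if token is None:
        raise RuntimeError("DB_TOKEN not set; inject it at deploy time")
    return token

# Simulate the orchestrator injecting the value for this process only.
os.environ["DB_TOKEN"] = "example-not-a-real-token"
print(get_db_token())
```

This doesn't make the secret unreadable (anything running inside the process can still see it), but it keeps it out of files that get committed, backed up, or casually read by tools crawling the filesystem.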
Their provider keeping backups only on the same volume as the data is also egregious, but that failure is downstream of leaking secrets to an adversary. The poorly scoped secrets are also bad, but not uncommon.
With all that stated... this kind of thing is inevitable if you have an autonomous LLM statistically spamming commands into the CLI. Over a long enough period of time the worst-case scenario is inevitable. I wonder how long it will be before people stop believing that adding a prompt which says "don't do the bad thing" works.
"Never store secrets on disk."
Wait till you learn how that API stores cryptographic material.
What's your point? Obviously, a secure server storing encrypted data on disk in a manner where it is only accessible through a secured API is not what is being discussed here.
How do you think the LLM will perform the required operations when the secrets are stored somewhere other than on disk? It still needs to obtain them, just as the application does when it has work to do.
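That point can be made concrete with a sketch. Even if credentials are fetched at runtime from a secrets manager (represented here by a hypothetical local function standing in for a Vault-style network call), the token ends up in the process's memory, where any code running in that process, including an autonomous agent, can read it. Short TTLs narrow the exposure window; they don't eliminate it:

```python
import time
import secrets

# Hypothetical stand-in for a secrets-manager API call; a real one would be
# a network request to a service like Vault, not a local function.
def fetch_short_lived_token(ttl_seconds: int = 300) -> dict:
    return {
        "token": secrets.token_urlsafe(16),  # random placeholder value
        "expires_at": time.time() + ttl_seconds,
    }

cred = fetch_short_lived_token()
# The token now lives in this process's memory until it expires. Anything
# with code execution here can exfiltrate it during that window.
print(len(cred["token"]) > 0)
```

The real mitigation is scoping: a token that can only read one database is a much smaller blast radius than one with broad production access, whether it lives on disk or in memory.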