There's also a chasm of (non-)accountability.
You or your subordinates target an elementary school: that's a war crime.
Your "battlefield AI" targets an elementary school: software bug, it happens, can't be helped.
"War Crimes" only apply to the loser of the war and are prosecuted by the victor.
Meaning whatever horrors are done on either side, only the horrors committed by the loser will be "crimes". The inclusion of AI doesn't change that.
sadly, that's also true within Ukraine. I know that Russians are handling Ukrainian prisoners of war very brutally (no sources, why: [0]), but if not for [0], and if my co-citizens wouldn't turn on me for it, I could point out a good chunk of misconduct on the Ukrainian side as well.
I also recall the history lessons. I can't think of anyone who committed a war crime against Nazi Germany and was also internationally prosecuted for it. Yes, the West did prosecute some cases domestically, and there were some high-profile cases involving German POWs, but I can't recall a single Soviet soldier being charged for, e.g., rape.
[0]: there is nothing public left to link to that stayed up, and I'm long out of the private Telegram channels where such videos are posted; besides, even if I could, neither you nor the mods would want to watch a video of someone being beheaded
This isn't even that new. Part of the motivation for building autonomous nuclear response programs during the Cold War was specifically to remove accountability, and guilt, from human operators. But AI takes it to a new level.
The software is never accountable, so the human running it is always accountable.
that is how it should be, not how it is.