It's not the same.

The pilot becomes responsible for those outcomes. Indiscriminately killing civilians, for example, is a war crime. It's easier to get an AI to commit war crimes than it is to get humans to do so.

Perhaps, but I don't know if the difference is significant. Everything changes when we try to stretch rhetoric from stabbing someone with a sword to firing hypersonic missiles. We might hold the pilot responsible if they erase a building, but I'm far less comfortable blaming them when we know the targets are actually picked by computers using metadata. The difference gets increasingly vague.