> If war is mostly played out from a distance
I left a company because they pivoted to exactly this. There are so many companies in this space today, testing what they call "physical AI autonomy", and we have to recognize that this is already our reality.
There are entire marketplaces for buying pretrained, supported, private models, or datasets if you have your own goals. If you're interested purely in tinkering with GPS-denied or communications-lost operation, you can do that today.
I watched a demo video in March where a company was showing off its remotely instructed (note: not controlled) multi-format (spider, dog) robot swarm. The company claimed to be 35 km away when the drones dropped off the payloads and the mission was engaged. Lightweight explosives were used to take out a car.
This is our present.
People saw Black Mirror and made a business plan out of it.
https://en.wikipedia.org/wiki/Metalhead_(Black_Mirror)
Also this short film, Slaughterbots, from 2017: https://youtu.be/O-2tpwW0kmU?is=F7RNLXcVuLA5A_lA
It’s been a part of sci-fi for a long time.
It's going to happen and at some level I'd rather war casualties were measured in robots rather than people.
My concern is the cottage industry of integrating guns with half-baked AI at the lowest cost. And probably vibe coded, too.
The companies don't care - a sale is a sale. The MoD maybe doesn't care - 90% accuracy and fewer human casualties on their own side are a win. Governments want to save money, and by the time they find out the robots have gone rogue, it will be too late to do anything about it.
I can't wait for the day that killing a human, any human, is considered a war crime.
And then it will be just another war crime committed daily in conflicts, and nothing will happen because there is no world police?
Ask Ukrainians, Lebanese, Gazans, Somalilanders, or even Iranians for that matter - it may not make a big difference compared to today...
What I would love to see is a local government suing an arms producer over the efficacy of their weapons. (Or even funnier, the owner of a home destroyed by a drone suing the GPS company.)
We all know that the only things people in suits are really afraid of, more than hell, are a bad Q4 report and an expensive lawsuit.
The problem is always the same. It's not just MoD (is it MoW now?) that will have access to this.
YOLOv8 + optical flow works fine on an ESP32. You want to give a drone rough coordinates for a refinery and hit something in it, like a storage tank? That'll work. This means that, give it 5 years, relatively small groups will have access to it. This cannot be stopped.
The only real answer is to work to have groups that you can trust to have access to this first.
Sadly, building an AI that analyses camera imagery and aims at humans, from scratch, is these days almost an intern project. It's not really something you can control or ban, the way you can control, dunno, uranium enrichment.
Integrating it with a robot and sticking a gun on it, thankfully, requires a bit more know-how.
> The only real answer is to work to have groups that you can trust to have access to this first.
How will this help exactly?
Friendly fire is going to get crazy. Can't trust an LLM on its own for more than a few iterations...
Don’t worry, it will auto compact its context.
I can't wait for the Faro Plague and the robot dinosaurs.