> If you haven’t spent at least $1,000 on tokens today per human engineer, your software factory has room for improvement
At that point, outside of FAANG and their salaries, you are spending more on AI than you are on your humans. And they consider that level of spend to be a metric in and of itself. I'm kinda shocked the rest of the article just glossed over that one. It seems to be a breakdown of the entire vision of AI-driven coding. I mean, sure, the vendors would love it if everyone's salary budget just got shifted over to their revenue, but such a world is absolutely not my goal.
Yeah I'm going to update my piece to talk more about that.
Edit: here's that section: https://simonwillison.net/2026/Feb/7/software-factory/#wait-...
This is an interesting point but if I may offer a different perspective:
Assuming 20 working days a month: that's $1,000 x 20 == $20k a month, or $240k a year. So about a fresh grad's TC at FAANG.
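The annualization above can be sketched out explicitly (a rough sketch; the $1,000/day and 20-workday figures are the ones used in this thread, not measurements):

```python
# Rough annualization of the "$1,000 on tokens per engineer per day" figure.
DAILY_TOKEN_SPEND = 1_000    # dollars per engineer per workday, from the quote
WORKDAYS_PER_MONTH = 20      # assumption used in the comment above

monthly = DAILY_TOKEN_SPEND * WORKDAYS_PER_MONTH  # $20,000/month
annual = monthly * 12                             # $240,000/year

print(f"${monthly:,}/month, ${annual:,}/year")
# → $20,000/month, $240,000/year
```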
Now I've worked with many junior to mid-level SDEs, and sadly 80% do not do a better job than Claude. (I've also worked with staff-level SDEs who write worse code than AI, but they usually offset that with domain knowledge and TL responsibilities.)
I do see AI transforming software engineering into even more of a pyramid, with very few humans on top.
Original claim was:
> At that point, outside of FAANG and their salaries, you are spending more on AI than you are on your humans
You say
> Assuming 20 working days a month: that's 20k x 12 == 240k a year. So about a fresh grad's TC at FAANG.
So you both are in agreement on that part at least.
Important too: a fully loaded employee costs the company far more than the salary the employee actually receives. If loaded cost is roughly double base salary, that tips the balancing point towards $120k salaries, which is well into the realm of non-FAANG.
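A sketch of that tipping point, assuming (as the comment implies by moving from $240k to $120k) a fully loaded cost of roughly 2x base salary:

```python
# If AI spend matches the *fully loaded* cost of an engineer, the
# equivalent base salary is lower than the raw $240k annual figure.
ANNUAL_AI_SPEND = 240_000   # $1,000/day x 20 workdays x 12 months
LOADED_MULTIPLIER = 2.0     # assumption: loaded cost ~= 2x base salary

equivalent_base_salary = ANNUAL_AI_SPEND / LOADED_MULTIPLIER
print(f"${equivalent_base_salary:,.0f}")
# → $120,000
```

The 2x multiplier is a round-number assumption; commonly cited loaded-cost multipliers vary, and a lower multiplier would move the tipping point back up accordingly.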
It would depend on the speed of execution: if you can get the same amount of work done in 5 days by spending $5k on AI, versus spending a month and $5k on a human, the math makes more sense.
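Since both paths cost the same in that hypothetical, the comparison reduces to throughput; a toy sketch using the numbers from the comment (hypotheticals, not measurements):

```python
# Toy comparison: same $5k spend, different calendar time.
ai_cost, ai_days = 5_000, 5         # AI path: $5k, 5 days
human_cost, human_days = 5_000, 20  # human path: $5k, ~a month (20 workdays)

# With equal cost, the differentiator is rate of delivery:
# fraction of the task finished per day on each path.
ai_rate = 1 / ai_days
human_rate = 1 / human_days

speedup = ai_rate / human_rate
print(f"AI path delivers {speedup:.0f}x faster at the same cost")
# → AI path delivers 4x faster at the same cost
```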
You won't know which path has the larger long-term costs. For example, what if the AI version costs 10x as much to run?
If the output is (dis)proportionately larger, the cost trade-off might be the right one to make.
And it might be that tokens will become cheaper.
Tokens will become significantly more expensive in the short term, actually. This is not stemming from some sort of anti-AI sentiment. You have two ramps that are going to drive this: 1. Increased demand, at least linear growth, though likely this is already exponential. 2. Scaling laws demand, well, more scale.
Future, better models will demand both more compute AND more energy. We should not underestimate the slowness of energy production growth, or the supply chains required simply to hook things up. Some labs are commissioning their own power plants on site, but that doesn't truly raise the grid's growth limits: you're drawing on the same supply chain to build your own power plant.
If inference cost is not dramatically reduced, and models don't start meaningfully helping with innovations that make energy production faster and inference/training less power-hungry, the only way to control demand is to raise prices. Current inference prices do not cover training costs. The labs can probably keep covering those from funding alone, but once the demand curve hits the power production limits, only one thing can slow demand, and that's raising the cost of use.
$1,000 a year is maybe $5 per workday. I measure my own usage and am on track to spend $6,000 for a full year. I'm still at the stage where I like to look at the code I produce, but I do believe we'll reach a state of software development where one day we won't need to.
Maybe read that quote again. The figure is $1,000 per day.
The quote is "if you haven't spent $1,000 per dev today",
which sounds more like: if you haven't reached this point yet, you don't have enough experience, keep going.
At least that's how I read the quote.
Scroll further down (specifically to the section titled "Wait, $1,000/day per engineer?"). The quote in the quoted article (i.e. from the original source at factory.strongdm.ai) could potentially be read either way, but Simon Willison (the direct link) is absolutely interpreting it as $1,000/dev/day. I also think $1,000/dev/day is the intended meaning in the strongdm article.