The comment by user senko [1] links to a post by the same author describing a specific coding session that cost $15.98 for 8 hours of work. The post under discussion talks about leaving agents running overnight, in which case I'd guess "twice that amount" would be a reasonable approximation.
Or, if we assume the OP can only do 4 hours per sitting (mentioned in the other post) plus 8 hours of overnight agents, that's 12 hours a day, or 1.5x the 8-hour cost: $15.98 * 1.5 * 20 = $479.40 a month (excluding weekends).
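For concreteness, here's that estimate as a quick Python sketch. The $15.98-per-8-hours figure and the 4h sitting + 8h overnight split come from the posts above; the 20 workdays and the linear scaling of cost with hours are my assumptions:

```python
# Back-of-the-envelope monthly cost estimate.
# Assumes cost scales linearly with agent hours, which the
# linked post doesn't actually confirm.
COST_PER_8H = 15.98       # one 8-hour session, from the linked post
HOURS_PER_DAY = 4 + 8     # 4h sitting + 8h of overnight agents
WORKDAYS = 20             # weekdays only, no weekends

daily = COST_PER_8H * (HOURS_PER_DAY / 8)  # 1.5x the 8-hour cost
monthly = daily * WORKDAYS
print(f"${daily:.2f}/day -> ${monthly:.2f}/month")
# -> $23.97/day -> $479.40/month
```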
>$15.98 * 1.5 * 20 = $479.40 a month
Are people seriously dropping hundreds of dollars a month on these products to get their work done?
I don't like thinking about what this does to small companies, hobbyists, open-source programmers and so on if it becomes a necessity for staying competitive.
Especially since so many of those models have just freely ingested a whole bunch of open source software to be able to do what they do.
It fits perfectly into the loss of local-compute autonomy that seems to be the trend nowadays: https://andrewkelley.me/post/renting-is-for-suckers.html
If it wasn't obvious by now: big capital doesn't really care about open source, hobby coding, or small companies.
If you make $10k/mo -- which is not that much! -- $500 is 5% of revenue. All else held equal, if that helps you go 20% faster, it's an absolute no-brainer.
The question is: does it actually help you do that, or do you go 0% faster? Or 5% slower?
Inquiring minds want to know.
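As a rough break-even sketch, generously assuming that time saved converts linearly into revenue (no real business guarantees that):

```python
# Net effect of a $500/mo tool at $10k/mo revenue, under the
# assumption that a speedup translates 1:1 into revenue.
revenue = 10_000   # monthly revenue, dollars
tool_cost = 500    # monthly tool spend, dollars

cost_fraction = tool_cost / revenue        # 0.05 -> 5% of revenue
for speedup in (-0.05, 0.0, 0.20):         # 5% slower, no change, 20% faster
    net = speedup - cost_fraction
    print(f"speedup {speedup:+.0%}: net {net:+.0%} of revenue")
# speedup -5%: net -10% of revenue
# speedup +0%: net -5% of revenue
# speedup +20%: net +15% of revenue
```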
>If you make $10k/mo -- which is not that much! --
This is the sort of statement that immediately tells me this forum is disconnected from the real world. ~80% of full-time workers in the US make less than $10k a month before tax.
Source: https://dqydj.com/income-percentile-calculator/
$10k is closer to a software developer's yearly salary in my country than to a monthly one.
That being said, at least the $20/mo Claude Code subscription is really worth it, and many companies are paying for AI tools anyway.
And yet, the average salary of an IT worker in the US is somewhere between $104k and $110k. Since we're discussing coders here, and IT workers tend to be at the lower end of that range, maybe there is some context you didn't consider?
And yet, the difference between average and median isn't understood.
>And yet, the average salary of an IT worker in the US is somewhere between $104k and $110k.
After tax, that's like 8% of your monthly take-home pay. I don't see why it's unreasonable to scoff at having to pay that much to get the most out of these tools.
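A quick sanity check on that 8%, using the $104k-$110k gross figures above and a ~30% effective tax rate (the tax rate is my assumption, not from the thread):

```python
# How big a bite $500/mo takes out of take-home pay.
# The 30% effective tax rate is a rough assumption.
TOOL_COST = 500  # dollars per month

for gross in (104_000, 110_000):           # salary range quoted above
    take_home_monthly = gross * (1 - 0.30) / 12
    share = TOOL_COST / take_home_monthly
    print(f"${gross:,} gross -> ${take_home_monthly:,.0f}/mo take-home; "
          f"$500 is {share:.1%} of that")
# $104,000 gross -> $6,067/mo take-home; $500 is 8.2% of that
# $110,000 gross -> $6,417/mo take-home; $500 is 7.8% of that
```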
>maybe there is some context you didn't consider?
The context is that the average poster on HN has no idea how hard the real world is, because they work very high-paying jobs. Making a statement like "$10k a month is not a lot" makes you sound out of touch.
We're talking about people in really high-paying jobs deciding whether a tool is worth their time.
Why would anyone debate whether people who don't work those jobs should be using these tools, when that isn't part of their job?
>if that helps you go 20% faster, it's an absolute no-brainer.
Another thing: is your job paying you $500 more per month for going 20% faster?