> It only costs me like $180 a month in API credits (now that they banned using the max plan), so seems okay still.

I have a hard time imagining how much better Alexa would have to be for me to spend $180/month on it...

Just to clarify to people focusing on the $180/month price tag.

OpenClaw is not a CC-only product. You can configure it to use any API endpoint.

Paying $180/month to Anthropic is a personal choice, not a requirement to use OpenClaw.

So that leads to a question: Is there a physical box I could buy and amortize over 5-7 years for half the API cost?

In other words, assuming no price increase, 7 years of that pricing is $15k. Is there hardware I could buy for $7k or less that could replace those API calls or alternative subscriptions entirely?
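Back-of-the-envelope, in case anyone wants to check the figures (assuming the $180/month holds flat, which is a big assumption):

```python
monthly_api_cost = 180  # USD/month, from the comment above
years = 7

total_api = monthly_api_cost * 12 * years
hardware_budget = total_api / 2  # "half the API cost" target

print(total_api)        # 15120
print(hardware_budget)  # 7560.0
```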

I've personally been trying to decide whether I should buy a new graphics card for my aging desktops, since their current cards can't really handle LLMs.

For something like OpenClaw you realistically only need rather slow inference, so you can use SSD offload as described by adrian_b here: https://news.ycombinator.com/item?id=47832249 Though I'm not sure the support in the main inference frameworks (or arguably even in the GGUF format itself) is up to the task just yet.

You can't realistically replace a frontier coding model on any local hardware that costs less than a nice house, and even then it's not going to be quite as good.

But if you don't need frontier coding abilities, there are several nice models that you can run on a video card with 24GB to 32GB of VRAM. (So a 5090 or a used 3090.) Try Gemma4 and Qwen3.5 with 4-bit quantization from Unsloth, and look at models in the 20B to 35B range. You can try before you buy if you drop $20 on OpenRouter. I have a setup like this that I built for $2500 last year, before things got expensive, and it's a nice little "home lab."

If you want to go bigger than this, you're looking at an RTX 6000 card, or a Mac Studio with 128GB to 512GB of RAM. These are outside your budget. Or you could look at Mac Minis, a DGX Spark, or Strix Halo. These let you run bigger models, mostly much slower.

You can buy a roughly $40k GPU (the H100), which will cost another $100/mo in electricity on top of that, to get about 30-80% of the performance of OpenAI or Anthropic frontier models, depending on what you're doing.

Over 5 years, that works out to ~$45k vs ~$10k, and during that time better open models may become available, making the GPU a better deal, but it's far more likely that the VC-fueled companies advance more quickly (since that's been the trend so far).
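For anyone checking that math, here's the sketch using the rough numbers above (~$40k H100, ~$100/mo electricity, ~$180/mo API spend; all estimates, not quotes):

```python
gpu_cost = 40_000     # USD, rough H100 price from the comment above
elec_per_month = 100  # USD/month electricity estimate
api_per_month = 180   # USD/month API spend
months = 5 * 12       # 5-year horizon

local_total = gpu_cost + elec_per_month * months
api_total = api_per_month * months

print(local_total, api_total)  # 46000 10800
```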

In other words, the local economics do not work out well at a personal scale at all unless you're _really_ running the GPU at close to 50% utilization literally 24/7, and you're okay accepting worse results.

As long as proprietary models advance as quickly as they are, I think it makes no sense to try to run them locally. You could buy an H100, and then a new model that's too large to run on it could become the state of the art; suddenly the resale value plummets, and it's useless compared to using the new model via APIs or buying a new $90k GPU with twice the memory or whatever.

This feels like it should be state infrastructure, the way roads, railroads and the postal system are.

This feels like a market which hasn't settled into long-term profitability and is being subsidized by investors.

Note that the (edit: US) postal system is a for-profit system.

Given the trends of the capitalist US government, which constantly cedes more and more power to the private sector, especially google and apple, I assume we'll end up with a state-run model infrastructure as soon as we replace the government with Google, at which point Gemini simply becomes state infrastructure.

> Note that the (edit: US) postal system is a for-profit system.

That's not correct. If USPS makes more revenue than their expenses for a year, they can't pay it out as profits to anyone.

It's true that USPS is intended to be self-funded, covering its costs through postage and services sold rather than tax revenue. That doesn't mean there's profit anywhere.

> Note that the postal system is a for-profit system.

That depends on the country in question :-)

You can use models several times cheaper than Claude as well; it's not like you need anything big to handle all the use cases listed above.

Yeah, something like MiniMax m2.7 should be perfectly capable for this sort of thing, and is 10-20x cheaper

For something the size of Claude, probably not. But for smaller models, maybe (though they also are much cheaper to buy tokens for)

I mean, I'm getting $180/mo worth of fun out of playing with it and figuring out what it can do, so it's worth it.

Like, no one bats an eye at all the people paying $100/mo for Hulu + Live TV, or paying $350/mo for virtual pixels in candy crush / pokemon go / whatever, and I'm having at least that much fun in playing with openclaw.

Everyone in my circle would seriously bat an eye at all those numbers. Congrats on making it to the upper class.

Just for reference: I pay 8€ for mobile, 40€ for internet and some occasional 5€ for VPNs each month. That's all the digital service subscriptions I'll need to have fun.

I think quite a lot of people would bat an eyelid at those things.

If any of my friends admitted to spending $350/mo on candy crush, I'd think they badly needed help for a gambling problem.

I do see how a very busy businessman or a venture capitalist would gladly pay $180/month to offload chores and mundane work from his schedule. That comes down to $6/day, which probably matches his daily coffee budget.

Chores, yes. If there were a $180/month service where ALL my family's chores could be accomplished, I'd consider it.

That means picking up and cleaning the house after 3 kids and a dog. Grocery shopping. Dishes. Laundry. Chores.

Tech crap? Nope.

I would imagine that the list of digital chores of a very busy businessman is a bit more extensive. Even in your list, grocery shopping becomes a digital chore once your income is high enough.

My grocery store has offered a pick-up or delivery option ever since COVID. Pick-up actually cost nothing extra. It's been years since we used it so I can't say definitively that it's still free, but the downside wasn't cost: it was the ability to pick the best item. If you let the store choose, you'll get the saddest looking produce every time, and the meat that's set to expire tomorrow.

To each his own.