I find some value in it as kind of a better Alexa.

I have it hooked up to my smart home stuff, like my speaker and smart lights and TV, and I've given it various skills to talk to those things.

I can message it "Play my X playlist" or "Give me the gorillaz song I was listening to yesterday"

I can also message it "Download Titanic to my jellyfin server and queue it up", and it'll go straight to the pirate bay.

Having a browser, the ability to run CLI tools, and a good enough grasp of English to know that "Give me some Beatles" means it should use its audio skill makes it a vastly better Alexa.

It only costs me like $180 a month in API credits (now that they banned using the max plan), so seems okay still.

> It only costs me like $180 a month in API credits (now that they banned using the max plan), so seems okay still.

I have a hard time imagining how much better Alexa would have to be for me to spend $180/month on it...

Just to clarify, for people focusing on the $180/month price tag:

OpenClaw is not a CC-only product. You can configure it to use any API endpoint.

Paying $180/month to Anthropic is a personal choice, not a requirement to use OpenClaw.
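For the curious, the usual pattern with these agent frameworks is an OpenAI-compatible endpoint override. The variable names below are hypothetical placeholders (check OpenClaw's own docs for the real keys), but the general shape is something like:

```shell
# Hypothetical config fragment: point the agent at any OpenAI-compatible
# endpoint instead of Anthropic's API. Variable names are illustrative,
# not OpenClaw's actual configuration keys.
export OPENCLAW_BASE_URL="https://openrouter.ai/api/v1"  # or a local llama.cpp server
export OPENCLAW_API_KEY="sk-..."                         # key for whichever provider you pick
export OPENCLAW_MODEL="your-preferred-model"
```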

So that leads to a question: is there a physical box I could buy and amortize over 5-7 years that comes out to half the API cost?

In other words, assuming no price increase, 7 years at that price is about $15k. Is there hardware I could buy for $7k or less that could replace those API calls or alternative subscriptions entirely?

I've personally been trying to decide whether I should buy a new GPU for my aging desktop(s), since their graphics cards can't really handle LLMs.

For something like OpenClaw you realistically only need rather slow inference, so use SSD offload as described by adrian_b here: https://news.ycombinator.com/item?id=47832249 Though I'm not sure that the support in the main inference frameworks (and even in the GGUF format itself, at least arguably) is up to the task just yet.

You can't realistically replace a frontier coding model on any local hardware that costs less than a nice house, and even then it's not going to be quite as good.

But if you don't need frontier coding abilities, there are several nice models that you can run on a video card with 24GB to 32GB of VRAM. (So a 5090 or a used 3090.) Try Gemma4 and Qwen3.5 with 4-bit quantization from Unsloth, and look at models in the 20B to 35B range. You can try before you buy if you drop $20 on OpenRouter. I have a setup like this that I built for $2500 last year, before things got expensive, and it's a nice little "home lab."

If you want to go bigger than this, you're looking at an RTX 6000 card, or a Mac Studio with 128GB to 512GB of RAM. These are outside your budget. Or you could look at Mac Minis, a DGX Spark, or Strix Halo. These let you run bigger models, but mostly much slower.

You can buy a roughly $40k GPU (the H100), which will cost about $100/mo in electricity on top of that, to get about 30-80% of the performance of OpenAI or Anthropic frontier models, depending on what you're doing.

Over 5 years, that works out to ~$45k vs ~$10k, and during that time better open models may become available, making the GPU a better deal, but it's far more likely that the VC-fueled companies advance quicker (since that's been the trend so far).

In other words, the local economics do not work out well at a personal scale at all unless you're _really_ maxing out the GPU at close to 50% literally 24/7, and you're okay accepting worse results.
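The arithmetic above, sketched out with the thread's rough figures (the dollar amounts are the commenter's estimates, not quotes):

```python
# Back-of-the-envelope: owning an H100 vs. buying API credits over 5 years.
MONTHS = 5 * 12

gpu_total = 40_000 + 100 * MONTHS   # ~$40k up front + ~$100/mo electricity
api_total = 180 * MONTHS            # ~$180/mo in API credits

print(f"GPU over 5 years: ${gpu_total:,}")             # $46,000
print(f"API over 5 years: ${api_total:,}")             # $10,800
print(f"GPU costs {gpu_total / api_total:.1f}x more")  # 4.3x
```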

As long as proprietary models advance as quickly as they are, I think it makes no sense to try to run them locally. You could buy an H100, and then a new model too large to run on it could become the state of the art; suddenly the resale value plummets and it's useless compared to using the new model via APIs, or buying a new $90k GPU with twice the memory, or whatever.

This feels like it should be state infrastructure, the way roads, railroads and the postal system are.

This feels like a market which hasn't settled into long-term profitability and is being subsidized by investors.

Note that the (edit: US) postal system is a for-profit system.

Given the trends of the capitalist US government, which constantly cedes more and more power to the private sector, especially google and apple, I assume we'll end up with a state-run model infrastructure as soon as we replace the government with Google, at which point Gemini simply becomes state infrastructure.

> Note that the (edit: US) postal system is a for-profit system.

That's not correct. If USPS makes more revenue than their expenses for a year, they can't pay it out as profits to anyone.

It's true that USPS is intended to be self-funded, covering its costs through postage and services sold rather than tax revenue. That doesn't mean there's profit anywhere.

> Note that the postal system is a for-profit system.

That depends on the country in question :-)

You can use models several times cheaper than Claude as well; it's not like you need anything big to handle all the use cases listed above.

Yeah, something like MiniMax m2.7 should be perfectly capable for this sort of thing, and is 10-20x cheaper

For something the size of Claude, probably not. But for smaller models, maybe (though they also are much cheaper to buy tokens for)

I mean, I'm getting $180/mo worth of fun out of playing with it and figuring out what it can do, so it's worth it to me.

Like, no one bats an eye at all the people paying $100/mo for Hulu + Live TV, or paying $350/mo for virtual pixels in candy crush / pokemon go / whatever, and I'm having at least that much fun in playing with openclaw.

Everyone in my circle would seriously bat an eye at all those numbers. Congrats on making it to the upper class.

Just for reference: I pay 8€ for mobile, 40€ for internet and some occasional 5€ for VPNs each month. That's all the digital service subscriptions I'll need to have fun.

I think quite a lot of people would bat an eyelid at those things.

If any of my friends admitted to spending $350/mo on Candy Crush, I'd think they badly needed help for a gambling problem.

I do see how a very busy businessman or a venture capitalist would gladly pay $180/month to offload chores and mundane work from his schedule. That comes down to $6/day, which probably matches his daily coffee budget.

Chores, yes. If there were a $180/month service that could accomplish ALL my family's chores, I'd consider it.

That means picking up and cleaning the house after 3 kids and a dog. Grocery shopping. Dishes. Laundry. Chores.

Tech crap? Nope.

I would imagine that the list of digital chores of a very busy businessman is a bit more extensive. Even on your list, grocery shopping is something that becomes digital once you're high enough in income.

My grocery store has offered a pick-up or delivery option ever since COVID. Pick-up actually cost nothing extra. It's been years since we used it so I can't say definitively that it's still free, but the downside wasn't cost: it was the ability to pick the best item. If you let the store choose, you'll get the saddest looking produce every time, and the meat that's set to expire tomorrow.

To each his own.

> I can message it "Play my X playlist"

People do this? Or is it some sort of joke way above my head?

In what bizarre world is it easier to ask a massive LLM to play a playlist rather than ... literally hitting the play key on it?

> It only costs me like $180 a month in API credits

In The Netherlands you can get a live-in au-pair from the Philippines for less than that. She will happily play your Beatles song, download the Titanic movie for you, find your Gorillaz song and even cook and take care of your children.

It's horrible that we have such human exploitation in 2026, but it does put into perspective how much those credits are if you can get a real-life person doing those tasks for less.

I'm surprised to read that. Here in the UK, having a live-in au pair doesn't excuse you from paying the minimum wage for all the hours that they're working (approx $2300/month for a 35 hour week). You can deduct an amount to account for the fact that you're providing accommodation, but it's strictly limited (approx $400/month).

The Netherlands has a weird and exploitative setup where you can classify your au pair as a "cultural exchange", and then pay them literal peanuts (room and board plus a token amount of "pocket money")

Another weird cultural quirk of the Dutch that will hopefully go the way of Zwarte Piet one day.

From what I can see online, the average compensation that an au-pair in The Netherlands receives is 300 euro per month, with living expenses being covered by the family. There is no minimum wage requirement for au-pairs like in the UK or the US.

The added cost of providing room and food for an additional person way exceeds that €300/month. Especially when taking into consideration that you might have to extend/renovate the house to lodge another person. Adding an extra bedroom and possibly a bathroom is not cheap.

Even if you assume the cost of lodging was 1000€ (which it isn't) then the au-pair would still be significantly underpaid.

A normal full-time employee costs at least 2000€ a month (salary, tax, pension plan, health insurance, etc.). If you are paying less than that you are definitely exploiting them.

[deleted]

A semi-skilled English-speaking customer service agent in PH makes less than $700 a month to put this into perspective.

Working abroad is a totally reasonable proposition compared to working in the Philippines.

So in reality you’re paying for their food, electricity and heat, letting them rent a room for free, and allowing them the use of the other facilities in your home and on top of that you’re giving them a spending allowance of 300 euro.

The marginal cost of food/electricity/bed for adding one additional person to a family is drastically less than those things would cost for a person living alone. Whichever way you slice this, the employer is making out like a bandit under this scheme.

In fact, you could do this for a homeless person today, in any city on the globe! And never even ask them to do anything for you!

[dead]

We shouldn't have to "import" people from poorer countries to do the mundane tasks we got too lazy to do ourselves.

The concept of having this kind of help is totally foreign to me, but with the exception of one, every family I’ve encountered that had an au pair have been two very busy high earning parents, neither of them lazy. I think you could argue that perhaps priorities have been misplaced, but not lazy.

Machines don't get tired, don't have to sleep, don't face principal-agent problems and can accumulate Skill.md instructions for decades without getting replaced. I definitely see the potential of something like OpenClaw for those who can afford it.

Surely that’s subsidized?

A lot of people in the Silicon Valley area spend that much ($6/day) on coffee. What they don't realize is how out of touch they are in thinking it makes sense for the rest of the fucking world. $180/mo is about 5% of the median US per capita income. It's not going to pick your kids up from school, do your taxes, fix your car, or do the dishes. It's going to download movies and call restaurants and play music. It's a hobby: a high-touch leisure assistant that costs a lot of money.

They aren't selling it to the median US earner. They're selling it (and trying to generate FOMO) to the out of touch people so that it becomes so entrenched that the median earner will be forced to use it in some capacity through their interaction with businesses, schools, the government, etc.

Realistically, you certainly don't need Anthropic's models for those things and can get something for a fraction of the price on OpenRouter/etc.

You're paying the au pair partly in accommodation, food, bills and a visa. The visa isn't coming out of your bank account, but it's definitely part of the incentive, so you could see it as a government subsidy.

For comparison, a full time "virtual assistant" with fluent English from the Philippines costs upwards of $700/month nowadays.

How is that remotely possible without committing enormous violations of labor law?

Framed this way, "replacing" this kind of human exploitation is definitely a good for humanity. If someone doing a job is practically a slave, then replacing them with an electron-to-token converter is a good thing.

The number one goal of AI should be to eliminate human exploitation. We want robots mining the minerals we use for our phones, not children. We should strive to free all of humanity from dangerous labour and the need for such jobs to exist.

If Elon Musk wants Optimus robots to help colonize Mars shouldn’t he be trying to create robots that can mine cobalt or similar minerals from dangerous mines and such?

> The number one goal of AI should be to eliminate human exploitation.

I have some bad news.

I doubt this is true in .nl. 180 a month is low for a live-in au-pair.

> In The Netherlands you can get a live-in au-pair from the Philippines for less than that.

And you see nothing wrong with that?

I don't want to be judgemental, but I do find it funny that you're paying $180 for this convenience, and use it to pirate movies.

Then allow me to be judgemental in your stead. I've done a similar setup as the above and completely locally. I dunno how they're paying so much, but that's ridiculously overpriced.

All the other models performed much worse for the skills I'm using. I tried gpt-5.1 (and then 5.4 again recently), and also tried pointing it at OpenRouter and using a few of the cheaper models, and all of them added too much friction for me.

Be judgemental all you want, but I feel like I'm paying for less friction, and also more security, since my experiments also showed Claude to be the least vulnerable to prompt injection attempts.

> models performed much worse for the skills I'm using

Hard to believe, unless you're doing something much more complex than the things you listed.

It's not the only thing they're doing with it. I mean, the logic is sound - $180 goes into automating bunch of manual processes in personal life, one of which is getting movies, which in some cases involves going out on the high seas.

Let's also point out the $180 is going to a hideously evil AI company which pirated millions of books and movies.

180 grand a month for a PA is a lot of money. But I guess each person has their own priorities. I mean, with that price I could pay for a very fancy gym instead of the shitty popular one I go to, which would probably improve my well-being much more than asking it to play Gorillaz.

"a grand" means a thousand (dollars or pounds or whatever). $180k / month really would be a lot of money. I'd be your PA for that!

Am I right to be a little concerned by the phrase "it'll go straight to the pirate bay"?

Not to be a narc or anything, but is OpenClaw liable to just perform illegal acts on your behalf just because it seemed like that's what you meant for it to do?

Seems like the only people using pirate bay in 2026 are "privacy obsessed" rich middle-aged guys.

I think they do it mostly to feel young and edgy.

> Not to be a narc or anything, but is OpenClaw liable to just perform illegal acts on your behalf just because it seemed like that's what you meant for it to do?

There's at least a couple of dozen instances right now, somewhere, getting very close to designing boutique chemical weapons.

$180/month to queue playlists does not "seem okay" at all. We must be living in different worlds.

You're spending $180 a month on tokens and still refusing to buy media like Titanic?

If you've figured out how to pirate Anthropic's models and enough GPUs to run it for less than my API costs, I'm all ears

While I love the idea of using it for home/personal automation (and it sounds like you've done a good job executing it), this comment makes it seem like avoiding paying for Titanic is almost as important as having an OpenClaw-driven assistant/automation system.

> "Download Titanic to my jellyfin server and queue it up", and it'll go straight to the pirate bay

You could build up a legitimate collection for much less than $180/mo.

Using OpenClaw for that is nuts. Claude or GPT could just one shot an app for you that does all that and uses 0 tokens once you've built it.

Regarding Alexa, none of those use cases sound useful enough to justify an ever-present listening device at home, unless one is bedbound or something.

I have almost the same thing using a network-connected Raspberry Pi and no AI.

[dead]