Exactly. At this level you don't just put out a statement of your personal opinion. This is run through PR and coordinated with the investors; otherwise the CEO finds himself on the street by tomorrow. Whatever their motives are, they are aligned with the VCs, because if they are not, then the next day there is another CEO. As the parent stated, this is not cynicism. I see it as simply factual: these are the laws of money.
I suspect the whole thing is a PR stunt to build public trust.
In none of their statements do they say they won't do these things:
> we cannot in good conscience accede to their request.
That's very specifically worded to not say "under no circumstances will we do this".
> Two such use cases have never been included in our contracts with the Department of War, and we believe they should not be included now
That's not saying they won't eventually be included.
They've left themselves room to backtrack, and given the care with which this statement has been crafted, that's surely deliberate.
This. This is public misdirection. They already signed a new deal. It may not be to their liking, but nothing in the statement prevents them from moving forward.
That is speculation. You might be correct, but this statement could simply be a strong signal to the administration to back down. A Hail Mary.
Isn't that what we're all doing in this thread? We could certainly take the document at face value but as a parent commenter said, almost every company starts off with "don't be evil" then goes and does evil things.
Is anthropic different? Maybe. But personally I don't see any indication to give them the benefit of the doubt.
> ... to back down.
Or else what?
> They've left themselves a backtrack, and with the care there this statement has been crafted, that's surely deliberate.
What's worse, someone in their PR department will read this thread and be disappointed that the spin didn't work.
I mean that’s just adulthood.
There are outcomes where the US government seizes the company. Not super likely, not impossible.
It would be naive to write a statement that a future event will never happen, under any circumstances. People who make that mistake get lambasted for hypocrisy when unforeseen circumstances arise.
I see recognition that making absolute statements about the future is best left to zealots and prophets. Which to me speaks of maturity, not duplicity.
> There are outcomes where the US government seizes the company. Not super likely, not impossible.
Are there historical examples in the US specifically where we've nationalized a business?
Because we've certainly invaded countries and assassinated leaders over exactly the same.
ETA: I could have answered my own question with two minutes of research. Yes, we have: https://thenextsystem.org/history-of-nationalization-in-the-...
I'm not sure why you are getting downvoted.
It is indeed a naive, or more likely dishonest, thing to do.
Anyone can promise anything. When there's little to no accountability and public memory/opinion doesn't last a week (or is easily manipulated anyway), promises mean literally nothing. Much like how, in politics, temporary means permanent.
Or HackerNews itself, with them implementing a little Big Brother. It will, of course, absolutely and without a doubt only "nudge" people, and it will absolutely, under no circumstances, pinky promise, never get any worse or do anything else but that.
When there are millions of fools, those who actually recognize that they are being fooled are rarely significant in number. They're drowned out by the fools, until said fools "wake up" and cry "if only we had known!".
Well ... you could have known, but in your mindlessness you didn't listen and think.
"It must be true, because they say so. D'uh. What are you, dumb?"
This. I don't get why you are getting downvoted. The statement literally says:
> Two such use cases have never been included in our contracts with the Department of War, and we believe they should not be included now
The last word is very important: "now". I'm not saying whether or not they're planning to back down, but this sentence doesn't rule it out. The "now" is clearly in reference to the fact that these use cases haven't been included in the past.
Being a tech forum centered around VC funding means we have a TON of tech bros (derogatory) here, who believe in nothing beyond getting their own piles of money for doing literally anything they can be paid to do. If you offered these guys $20 to murder a grandmother they'd ask if they have to cover the cost of the murder weapon or if that's provided.
I get it to a degree. People gotta eat, the market is awful right now, and most hyperscaler businesses have been psychologically obliterating people for a decade or more at this point. Why not graduate to doing it with weapons of war too? But personally, I sleep better at night knowing nothing I've made is helping guide missiles into school buses. But that's just me.
I share this sentiment.
In general - I don’t know if it’s a coincidence, but here on HN, for example, I’ve noticed an increasing number of comments and posts emphasizing the narrative of how “well-intended” Anthropic is.
Feel free to judge them by their actions rather than intentions. This situation being an example.
I'd love to see the financial model that offsets losing your single biggest customer and substantial chunk of your annual revenue with some vague notion of public trust.
This is so short-sighted. We are so early into this AI revolution, and this administration is obviously in a tailspin, with the only folk left in charge being the least capable ones we have seen in a decade.
Imagine what the conversation would be like if Mattis, a highly decorated and respected leader, were still the SecDef. Instead we are seeing bully tactics from a failed cable news pundit who has neither earned nor deserved any respect from the military he represents.
We are two elections and a major health issue away from a complete change of course.
But short-sightedness is the name of the quarterly reporting game, so who knows.
> We are so early into this AI revolution…
I keep hoping it’s almost over.
Not trying to be a Luddite. I put multiple questions to AI tools yesterday, and let Claude/Zed do some boilerplate code/pattern rewriting.
I’ve worked in software for 35 years. I’ve seen many new “disruptive” movements come and go (open source, objects, functional, services, containers, aspects, blockchains, etc.). I chose to participate in some and not in others. And whether I made the wrong choices or not, I always felt like I could get a clear enough picture of where the bandwagon was going that I could jump in, hold back, or something in between. My choices weren’t always the same as others’, so it’s not like it was obvious to everyone. But the signal felt more deterministic.
With LLMs/agents, I feel more unease and uncertainty about how much to lean in, and in what ways, than I ever have before. A sort of enthusiasm paralysis that is new.
Perhaps it’s just my age.
Didn't we go through this same kind of uncertainty with PCs, the internet, and smartphones? It's early and we're all noodling around.
I'm seriously worried there won't be more elections. Not hyperbole at all.
> I'm seriously worried there won't be more elections. Not hyperbole at all.
Why? That's an unrealistic fear, driven by the insanely overwrought political rhetoric of 2026. Think about it: elections will be the absolute last thing to go.
If you want something to worry about, worry about this:
> And the stakes of politics are almost always incredibly high. I think they happen to be higher now. And I do think a lot of what is happening in terms of the structure of the system itself is dangerous. I think that the hour is late in many ways. My view is that a lot of people who embrace alarm don’t embrace what I think obviously follows from that alarm, which is the willingness to make strategic and political decisions you find personally discomfiting, even though they are obviously more likely to help you win.
> Taking political positions that’ll make it more likely to win Senate seats in Kansas and Ohio and Missouri. Trying to open your coalition to people you didn’t want it open to before. Running pro-life Democrats.
> And one of my biggest frustrations with many people whose politics I otherwise share is the unwillingness to match the seriousness of your politics to the seriousness of your alarm. I see a Democratic Party that often just wants to do nothing differently, even though it is failing — failing in the most obvious and consequential ways it can possibly fail. (https://www.nytimes.com/2025/09/18/opinion/interesting-times...)
It's not an unrealistic fear. Trump has been making noises about "taking over elections." Abolishing elections wholesale is very unlikely, sure, but a sham election rigged by a corrupt government? That's standard fare for authoritarians. And there's evidence of voting anomalies in swing states in the 2024 election.
https://www.theguardian.com/us-news/2026/feb/27/trump-voting...
https://electiontruthalliance.org/
Yeah, Russia still has "elections" for all the good that does them.
Trump _says_ lots. Most of it doesn't come true.
FYI, even though you have a new account, you were banned from your first comment and all your comments automatically show up as hidden-by-default to most users.
It's not who votes that counts, but who counts the votes.
(Attributed to Stalin, but it likely comes from an earlier despot.)
Authoritarian nations continue to have elections, turnout is near 100%, and Dear Leader wins with 90% of the vote.
I don't think it's crazy to worry about that, but elections are run by the states, there are over 100,000 polling places nationally, and people are pissed. On Jan 3, the terms of the entire current House of Representatives end; Democratic governors will still hold elections, and if there haven't been elections in GOP-led states, those states are out of representation. There are so many hurdles in the way of the fascists canceling or heavily interfering in elections, and they're all just so stupid.
WaPo headline “Administration plans to declare emergency to federalize election rules.” https://www.washingtonpost.com/politics/2026/02/26/trump-ele...
Yeah, they can plan whatever they want. No such authority exists, and it must really be emphasized that they're all so stupid.
Stupid and effective are not mutually exclusive.
I do agree with you that no such authority exists, but this administration seems to get away with a lot of things they have no authority to do.
If you think they're pissed now, just wait to see how they react to election interference.
I recently read up on how the House of Representatives renews itself and quite frankly it's one of the most beautiful processes I've seen, completely removing the influence of the prior congress.
Putin crushes every election he has. Of course there would be more elections.
Mattis: the same highly decorated and respected leader who was on the board of directors at Theranos... edit: added Mattis
Their whole strategy is that the lack of a legal moat protecting their product is an existential threat to human life. They are the only moral AI and their competitors must be sanctioned and outlawed. At which point they can transition from AI as commodity to “value” based pricing.
It’s not going to work, but I can’t blame Amodei and friends for trying to make themselves trillionaires.
This is why we should be skeptical of companies that want to tie themselves to the military industrial complex in the first place.
$200M is >2% ARR at the last numbers we got from them, and would take them back... checks notes... literally only a few days of ARR growth.
I'd love to see any evidence that this single biggest customer is provably and irreversibly lost on all levels of scrutiny as a result of this attempt at building public trust.
The rest of the world moves to using you?
It absolutely is a PR stunt. And the media is cheering.
It's absurd.
It's simple: If you do not like working with the military, cancel your contract with the military and pay the penalties.
They are explicitly not doing that.
This effectively is cancelling, isn't it?
You're implying cancelling quietly would be better. But the department would just use a different supplier. This seems like the action someone would take if they cared about the issue.
> If you do not like working with the military, ...
Eh? But they do like to work with the military. How else are you going to "defend the United States and other democracies, and to defeat our autocratic adversaries"?
They want to work with the military, with just two additional guardrails.
> it is simply the laws of money
The First Law of Money: Money buys the Law.
To quote Brennan Lee Mulligan, "Laws are threats made by the dominant socioeconomic ethnic group in a given nation."
Certainly pre-democracy, other than the ethnic group bit.
The full[1] quote is:
> “Laws are a threat made by the dominant socioeconomic ethnic group in a given nation. It’s just the promise of violence that’s enacted, and the police are basically an occupying army, you know what I mean?”
...Which is funny, but technically speaking, it's (more or less) a paraphrasing/extrapolation of the very serious political science definition of a state, “a monopoly over the legitimate use of violence in a defined territory”
[1] Minus the last line, which I will allow others to discover for themselves
That's maybe the second law. The first one is: money is always finite.
Look at how Elon Musk behaved. Do you think the VCs gladly approved what he did with Twitter? They might want to keep chasing quarterly results - but sometimes, like with Zuckerberg, they can't. Not enough money. Similar examples: Google funding rounds, or how often the more financially backed politician loses to a competitor. Or, if you will, Vladimir Putin's idea that he can buy whatever results he wants - and that guy is a very wealthy person. There are always limits, putting the money law in second place. We might argue that often the existing money is enough... but in more geopolitical, continuum-curving cases there are other powerful forces.
The Twitter acquisition wasn't funded by venture capital, so your question about VC approval doesn't apply.
If you're using VC as a general term for "investor" (inaccurately), then the answer to your question is that the major investors, such as Larry Ellison and the Saudi monarchy, wanted political control of Twitter, which meant that they did (apparently) approve what Musk did with it.
You're missing the point. It matters little where exactly the money to pay for the acquisition of Twitter came from. What matters is that nobody expected Twitter to lose employees and users in such numbers. So whoever gave the money was still limited in ensuring the results were fully in line with their wishes. Because money is always finite.
FWIW, I don’t actually know whether the board of Anthropic has the actual power to replace its CEO, or whether Dario has retained some form of personal super-control shares, Zuckerberg style.
At some level of growth, the dynamics between competent founders and shareholders flip. Even if the board could afford to replace a CEO, it might not be worth it.
I'd counter that at this level of capital, if the CEO doesn't align well with the capital, then super-control shares will be overpowered by super-lawyers and, if needed, some super-donations. OpenAI was a public interest company...
Not at all. Especially at that level of capital. It’s the equity equivalent of “if you owe a bank a million dollars, you’re in trouble. If you owe a bank a billion dollars, the bank is in trouble.”
Capital is extremely fungible, and typically extremely overleveraged. Lawyers, on the other hand, are extremely overprotective. They won’t generally risk the destruction of capital, even in slam-dunk cases. Vide WeWork.
This is fundamentally incorrect.
Anthropic has an odd voting structure. While the CEO Dario Amodei holds no super-voting shares, there are special shares controlled by a separate council of trustees who aren't answerable to investors and who have the power to replace the Board. So in practice it comes down to personal relationships.
Surely you mean the laws of shareholder capitalism. There are many things you can do with money, and only some of them are legally backed by rules that ensure absolute shareholder power.