TBH I am sad that Anthropic is changing its stance, but in the current world, if you care at all about LLM safety, I feel this is the right choice: there are too many model providers, and they probably don't treat safety as high a priority as Anthropic does. (Yes, that might change, they can get pressured by the govt, yada yada, but they literally founded their own company because of AI safety, so I do think they actually care for now.)
If we need safety, we need Anthropic to be not too far behind (at least for now, before Anthropic possibly becomes evil), and that might mean releasing models that are safer and more steerable than others (even if, unfortunately, they are not 100% up to Anthropic’s goals)
Dogmatism, while great, has its time and place, and with a thousand bad actors in the LLM space, pragmatism wins out.
Do you work at Anthropic, or know people who do?
I'm genuinely curious why they are so holy to you, when to me they look like just another tech company trying to make cash.
Edit: Reading some of the linked articles, I can see how Anthropic's CEO is refusing to allow their product to be used for warfare (killing humans), which is probably a good thing and a reason to support them.
Let us not pretend that they won't be used for war eventually. If they cave immediately under pressure, then this is an inevitability.
How is it a good thing to refuse to provide our warfighters with the tools that they need? I mean if we're going to have a military at all then we owe it to them to give them the best possible weapons systems that minimize friendly casualties. And let's not have any specious claims that LLMs are somehow special or uniquely dangerous: the US military has deployed operational fully autonomous weapons systems since the 1970s.
This is the US military we’re talking about so 95% of what they do is attacking people for oil. They don’t “need” more of anything, they’re funded to the tune of a trillion dollars a year, almost as much as every other military in the world combined. What holy mission do you think they’re going to carry out with the assistance of LLMs?
That's a total non sequitur. If you think the military is being tasked with the wrong missions, or too many missions, then take that up with the civilian political leadership. But it's not a valid reason to deny the warfighters the best possible weapons systems.
Personally I favor a less interventionist foreign policy. But that change can only come about through the political process, not by unaccountable corporate employees making arbitrary decisions about how certain products can be used.
> But it's not a valid reason to deny the warfighters the best possible weapons systems.
Of course it is.
Think about it this way: if you could guarantee that the military suffers no human losses when attacking a foreign country, do you think that's going to lead to more or fewer foreign interventions?
The tools available to the military influence policy; these things are linked.
The US military is already overwhelmingly powerful; there's zero reason to make it even more powerful.
That's so delusional. The US military is currently preparing for a potential conflict with China to stop an invasion of Taiwan. They don't have anything near "overwhelming force" for that mission: recent simulations put it about even at best. People who believe they don't need any improved autonomous weapons are simply uninformed.
Why would the US enter into direct conflict with a nuclear power over a country they aren't even formally allied with?
If the US actually cared they'd formally place Taiwan under nuclear protection.
You are claiming all Americans must happily create weapons. That's a silly statement to most Americans and humans.
Don't presume to put words in my mouth. I flagged your comment for lying about my claims.
Individual Americans aren't slaves. They can do as they please and are under no obligation to help build weapons for warfighters. But I think it's ridiculous and offensive for a US corporation to presume to take on a role as moral arbiters by placing arbitrary limits on US government use of certain products. There are larger issues here that need to be addressed through the political process, not through commercial software license agreements.
Sure, it wasn't fair for me to claim you said that, so I apologize. It was rude of me to frame my position in that manner, and it wasn't intended maliciously.
I meant to suggest that corporations being unable to take those positions results in exactly such a world for the Americans working at those corporations.
> I think it's ridiculous and offensive for a US corporation to presume to take on a role as moral arbiters
A corporation is just a group of people. Anthropic isn't even public, and therefore its directors aren't subject to any sort of fiduciary duty enshrined in law. They can collectively act as they wish.
> If you think the military is being tasked with the wrong missions, or too many missions, then take that up with the civilian political leadership. But it's not a valid reason to deny the warfighters the best possible weapons systems.
It is an ethical dilemma: believing an armed force will act unethically is in fact a valid reason to refuse to arm them. You are taking a nationalistic view regarding the worth of life.
And if you believe it is unethical to arm them, it is rational to use whatever leverage you have available to you - such as refusing to sell your company's product.
Furthermore, one of the two points at issue was regarding surveilling civilians.
> that change can only come about through the political process
What, to you, is the political process? Why is wielding your economic leverage to incite change illegitimate to you?
"How is it a good thing to refuse to provide our warfighters with the tools that they need?"
Perhaps you should consider that this is a loaded question. I don't think HN needs this sort of Argumentum ad Passiones.
Why are you asking this question? You know what the answer is, you've just arbitrarily decided that it's specious in an attempt to frame rebuttals as unreasonable.
I'm open to reasonable rebuttals but all the rebuttals that I've seen so far are simply uninformed.
1. You don't believe in the mission or direction of US warfighters.
2. Supporting warfighters is distinct from what you want your corporate competencies and direction to be.
3. You don't want the military to be safer and more capable.
> If we need safety, we need Anthropic to be not too far behind (at least for now, before Anthropic possibly becomes evil)
I don't think it's going to be as easy as you believe to tell that they're becoming evil before it's too late, if this doesn't raise any alarm bells for you that this is already their plan.
The world would be so much nicer if there were just fewer pragmatists shitting up the place for everyone. We might actually handle half our externalities.