Imagine if a gun manufacturer sold a gun that you couldn't use against X or Y country. Private companies imposing such demands on our military should not be respected. Weapons that can trip on a false positive and shut themselves down because they think you are using them wrong is a feature I would never want built in.
I have also been against these terms of service restricting the usage of AI models. It is ridiculous that these private companies get to dictate what I can or can't do with the tools. No other tools work like this. Every other tool is governed by the legal system that the people of the country have established.
It sounds like you think that Anthropic is the first company regulating the use of their product. This is not a novelty whatsoever.
No, but I find it obnoxious as an end user.
Then don’t create a mass surveillance program on Americans and you shouldn’t have to worry about it ;)
Have you not read the Usage Policy that regular people have to follow? For example, you are not allowed to use their API to automatically summarize your blog post and share the link on X as you are not allowed to make posts automatically.
These models will be able to run on a machine in your pocket locally within a few decades.
Taking principled stands should absolutely be respected.
I can respect a stance while simultaneously calling out how much I dislike it.
> Imagine if a gun manufacturer sold a gun that you couldn't use against X or Y country
That kind of happens with F35s that the US sells to its allies.
Only Israel can make software upgrades and changes to its F35s.
> Imagine if a gun manufacturer sold a gun that you couldn't use against X or Y country.
The point here, of course, being that Anthropic is very specifically claiming to not be a gun manufacturer, and Hegseth's response is that the DoD (W?) will force Anthropic to build guns.