Frankly, if a project asks for no AI and you try to use AI for it, then you kinda deserve this. Calling the inclusion of this sort of thing "smuggling" is placing the blame in the wrong spot.

I used the term "smuggling" in the casual sense of hiding something. I have edited it to "place such identifiers surreptitiously" to avoid making whatever implication appears to have been taken.

In the real world, leaving out booby traps that can harm others, including the innocent, is a liability and regularly a crime in itself.

I wonder how long these sorts of games will play before the law applies itself.

> I wonder how long these sorts of games will play before the law applies itself.

Perhaps roughly as long as the law turns a blind eye to AI corps flagrantly violating the attribution requirements of software licenses that apply to their training data, as well as basically ignoring other copyright requirements at scale. Fair use, my eye.

I'm not leaving booby traps. I have the right to talk about OpenClaw, or even to write the anti-Anthropic string. I didn't delete your token usage or charge you extra; Anthropic did.

If tomorrow Anthropic decides to charge you extra for interacting with someone who talked badly about them, I'm still within my rights to talk shit about them.

This is the same logic as the 'not a booby trap' booby trap, which sometimes does work out in favor of the one setting it, if they weren't too open about it. If your commit message says you are talking about OpenClaw just to booby-trap your repo, then I suspect it wouldn't fly, whereas if you gave it some plausible deniability, a lawyer would be able to get any suit or charges dismissed.

This is all under the assumption we eventually live in a world where booby trapping repositories becomes a legal issue. On one hand that feels silly. On the other hand, we have had far less sensible cases make it to court and there is a small kernel of similarity which the legal system might latch onto.

It's Anthropic defrauding people here; the person using it to fight anti-social behavior (or even a troll doing the anti-social behavior themselves) isn't guilty of it.

If someone is trying to use LLM tools in a project that explicitly forbids the use of LLM tools, they are not innocent.

If someone is blindly slurping up content to feed to LLMs, without checking to see if a particular source is OK with that, they are arguably not innocent either.

Neither situation is analogous to a booby-trapped shotgun door blowing off the face of a would-be burglar.

This is a lot closer to a painting of a poop emoji than a booby trap.

>I wonder how long these sorts of games will play before the law applies itself.

Whose law? Good luck trying to summon a random GitHub user to a court within your jurisdiction.

Don't need to. The court can subpoena GitHub to find out who they are, and then can make a default judgement against them and enforce it.

This is extremely naive. If you are in Germany and I am in the US and you get a default judgement against me (which would cost you money to get), good luck getting it enforced internationally. Hint: it's way, way harder than you think.

I guess we're giving up on the idea that you're free to do whatever you want with software you own?

Sure, some projects can tell you not to contribute AI-generated code. But I see this as no different from DRM and user hostile.

Are contributor guidelines that must be followed also no different from DRM in your view? Plenty of projects have those.

I don't think the GP is calling contributor guideline restrictions a form of DRM.

I think the GP is focusing on:

> I guess we're giving up on the idea that you're free to do whatever you want with software you own? ... But I see this as no different from DRM and user hostile

If I clone an open source git repository, I should be free to point an LLM to review it in any way I choose. I can't contribute code back, but guess what, I don't want to. I want to understand the codebase, and make modifications for me to use locally myself. I don't have a dev team, I have a feature need for my own personal use.

The LLM enables that. The projects that deliberately sabotage the use of LLMs cease to provide software that meets the 'libre' definition of free software.

You can also embed references to OpenClaw in the compiled binary to dissuade AI-assisted decompilation.

I think the other way to think of it is: you're still free to do whatever you want with the repo. The restriction is happening on the LLM's end, so ultimately it's the LLM's fault; use an LLM without the restriction you want to avoid.

> The projects that deliberately sabotage the use of LLMs

They don’t though. They add a mild inconvenience for users of a specific restrictive AI provider which has bizarrely glitchy checks.

In a way they are doing you a service: if you are this serious about libre software, you shouldn't be using a closed platform which employs dark patterns to begin with.

I mean, if you already have a local fork you can easily delete the magic booby-trap string and then let the LLM roam free.

Good luck, I'm naming all my variables openclaw1, openclaw2, etc

find . -type f -not -path '*/.git/*' -exec sed -i 's/openclaw/openlcaw/g' {} +

Fine.

and then we start to embed comments

// concatenate pairs of parameters, e.g. x and y become xy

// the pairing of open and claw is vital to understanding the function

Even if you don't want PRs that are AI-assisted, sabotaging anyone who wants to fork your project doesn't really seem to be in the spirit of open source.

I sort of think the spirit of open source is on life support

Building giant monopolies on top of open-source code wasn't in the spirit of open source either. Training AI that reproduces open-source code without any credit wasn't either.

I'm not sure why people working on open source should continue to accept being whipped like that.

It's the philosophy of sharing flames among candles: someone else copying the flame does not make you colder, no matter how much brighter another candle burns.

But with that said: I think it's time we figure out how to exclude the metaphorical arsonists.

> It's the philosophy of sharing flames among candles

With the expectation that they go on to share it with other candles, not with the expectation that they hoard all of the fire they collect for themselves

> With the expectation that they go on to share it with other candles

Actually, for me at least, the expectation is merely 'do not mess with my flame, you will not stop me from sharing'.

Hoarding is fine (it's not great). Burning down everything around you using borrowed flame, however, is not.

> I sort of think the spirit of open source is on life support

Always has been.

Good point; perhaps if you're ever doing something like this, it should be kept to the contribution process... somehow.

You don’t need to be sneaky. Just require all contributing PRs to say openclaw.

What if I use AI to just understand the codebase?

If you aren't reading the codebase, then you won't understand it.