I don't know if I'm not getting the idea right, but I'm pretty sure people refer to AI outputs as "slop" not (only) due to repetitiveness. According to some sources:

[1] Wikipedia

> AI slop is digital content made with generative artificial intelligence, specifically when perceived to show a lack of effort, quality or deeper meaning, and an overwhelming volume of production.[1][4][5] Coined in the 2020s, the term has a pejorative connotation similar to spam.[4]

[2] Urban dictionary

> Low-quality randomly generated AI content (images, accounts, text, etc) that has been flooding social media sites among other pages.

Yes, I know those may not be the best primary sources, but I'd say the main shared meaning of the word is lack of quality and effort, not repetitiveness itself.

[1] https://en.wikipedia.org/wiki/AI_slop

[2] https://www.urbandictionary.com/define.php?term=AI+slop

Gain-of-function research to create memetic-immune-system-evading AI variants.

> Ethics Statement

> Potential harms include: [...] (ii) attempts to evade AI-text detection.

And it's not clear to me how their mitigations would avoid fooling users (as opposed to algorithmic detection attempts).

Yeah, what this actually achieves if anything is making it harder to quickly recognize slop for what it is, so readers are more likely to give it the benefit of the doubt and keep their eyeballs on it for longer. Which I suppose is desirable if you're in the slop-mongering business (e.g. doing SEO spam or other such methods of flooding the commons with sewage for the sake of profit).

Fits into a broad pattern of deceptive LLM terminology, for example "Deep Research": a humble and honest moniker would be "Reflection" or "Recursive self-prompting".

Yep, and their only reference to the word points to a survey that does not mention "slop" even once (A survey on LLM-generated text detection: Necessity, methods, and future directions. Computational Linguistics, 51(1):275–338, 2025. https://arxiv.org/abs/2310.14724).

That's sloppy (hehe): if you are going to redefine a common word for the first time (i.e. references are not possible), at least do it explicitly.

> I don't know if I'm not getting the idea right, but I'm pretty sure people refer to AI outputs as "slop" not (only) due to repetitiveness. According to some sources:

Yeah, slop is low-effort use of AI output ("ChatGPT, write me a blog post about using AI in industry X. Copy. Paste. Publish."). If anything, this should be called Stealthslop, and when slop is harder to detect we'll all waste more time on it.

I came here to say this -- it's going to be slop no matter what.

The LLM erotic roleplaying community's usage of "slop" aligns with the definition in this paper, so it's not without precedent. Several novel sampling methods have originated from that community trying to address this specific issue.

Yup. You see this with the very first projects to adopt a new sampler being oobabooga's text-generation-webui and SillyTavern, circa early 2023 with min_p. Same with diffusion models: the first projects to get new denoising algorithms are ComfyUI, Automatic1111, etc.
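For anyone unfamiliar with min_p: it keeps only tokens whose probability is at least a fraction `min_p` of the most likely token's probability, then renormalizes over the survivors and samples. A minimal sketch in plain Python (the function name and interface here are illustrative, not taken from any of the projects mentioned):

```python
import math
import random

def min_p_sample(logits, min_p=0.1, rng=random):
    """Sample a token index from logits using min_p filtering."""
    # Softmax (with max-subtraction for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # min_p filtering: keep tokens whose probability is at least
    # min_p times the probability of the most likely token.
    threshold = min_p * max(probs)
    kept = [(i, p) for i, p in enumerate(probs) if p >= threshold]
    # Renormalize over the surviving tokens and sample.
    z = sum(p for _, p in kept)
    r = rng.random() * z
    acc = 0.0
    for i, p in kept:
        acc += p
        if r <= acc:
            return i
    return kept[-1][0]  # guard against floating-point rounding
```

The appeal over fixed top-k or top-p is that the cutoff scales with the model's confidence: when one token dominates, almost everything else is filtered; when the distribution is flat, many candidates survive.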

Nothing wrong with that, but at (1) least reference it or (2) define it yourself explicitly.

Honestly, "slop" should also be retroactively applied to e.g. Buzzfeed content; it shouldn't just be AI-centric.

It isn’t AI centric, it’s derived from poor quality wet food. Often given to pigs or used to describe prison food. It’s the origin of the term ‘sloppy’.

Colloquially it means ‘poor quality’ and always has done. So buzzfeed is journalism slop, just like poor quality AI content is AI slop.

In practice, it's used for anything the speaker doesn't approve of, regardless of quality. When someone uses it, it basically tells me: I don't have anything critical to say beyond "I don't like this thing."