because we typically want to know the writer of a piece. we want to know where to lay credit.
every book you buy has an author credited. articles in newspapers and magazines have photographer and author attributions.
asking an ai to write you a story does not make you an author. if you ask someone to take a photo for you, you don’t magically get to say “look at this photograph, i’m a photographer.” if you ask someone to bake you a wedding cake, and then claim you baked it, you’re a fraud.
we deserve to know the actual writer.
> want to know the writer of a piece
but you dodged the question i asked - why can't a piece stand on the contents, rather than its pedigree?
Would you care if a writer used a pen name? Does that in any way diminish their works? What about the unknown editors that contributed?
Because you need to do some pre-filtering on where to focus your attention, and you want to make sure the author put some thought into the article without having to analyze it.
Due to LLMs making the cost of publishing “thoughts” extremely low, there’s now an over-supply of content that looks decent on the surface, but in reality the author has probably spent less time on it than the reader.
Are we really so far down into the LLM denial mindset that we consider an author spending multiple months crafting this to be "worthless" and less investment than your casual reading?
No, I believe this is a great post. It’s awesome. Even more so because it’s AI generated, as it shows what AI can do when given a lot of quality material to work with.
I’m just talking about the general topic of the usefulness of a “this is AI generated” classifier.
Don't we already have these filters in place? I only saw this because it was highly-upvoted on HN, for example - I don't read every new submission. I also read things sent by friends and family, shared by curators I trust, etc.
Of course these systems may eventually break down, but for now they seem to work.
why does it bother you to give attribution? why do you think crediting the writer impacts how the piece stands?
we have pop musicians who produce massive hits under their names and the song writers are still given credit in liner notes and in the tracks details on spotify or wherever.
if it’s created by a bot, i’d take it even further and say which version of which model actually generated it should be declared. why would anyone be against giving proper attribution?
We like writing because the fact that we can create good writing says something about ourselves. If AI can create writing that surpasses, say, a Tolstoy or George Eliot, that will fundamentally change our self-perception. Is that a good thing or bad thing? Well, let's first cross the bridge of an LLM writing War & Peace and see how we feel.
It's not about pedigree, but context. Without context our most beloved stories are just meaningless ink on paper.
If someone couldn't be bothered to write it, I certainly can't be bothered to read it. I didn't bother to read the article involved because the continual piss stain on the images, the website itself, and a few key phrases tipped me off to the fact that it was all generated.
When you interact with art, you do so to interact with the author and the point they want to make. Writing is something where a skilled writer will be able to make a point tersely and have it stick, knowing where to embellish and where to keep it simple. Every decision in art tells you about the artist. Generative AI may be able to fake the composition process, but the point of composition is that it reveals something about the human. All of those are artistic decisions that a machine apparently now "can do", but not with any coherency.
The holder of the reins of slop is not an artist; this is plain to see because they do not interact or engage with their work on the same level as an artist. The produced slop is not art, because it cannot be engaged with on that level.
I’ve said this many times before
AI is just a tool
If you used a fancy auto bake cake machine instead of an oven, you still get to claim that you made the cake.
100 years ago someone would have been making the claim that using an oven to make cakes “doesn’t count”
All AI did was raise the bar
It’s quite clear here that the author spent a lot of time on this so he absolutely gets credit as the author
I think there's a distinction.
Imagine if you had an auto cake making machine that decides on its own the best time to make cake. It adds the ingredients, stirs, turns the oven on, and leaves the finished cake on the counter for you.
People start opening bakeries consisting entirely of cakes baked by the automatic machines. The owners of these machines have no idea whether the cakes have a bit too much flour or were slightly over-stirred. In some cases, they haven't even tried the cakes.
Who gets to claim they made the cake?
By contrast, there are others who carefully tune their machines to make sure everything is perfect. They adjust the mixing settings and ingredient proportions. They experiment and iterate. They taste test throughout the process. And what they give to the public tastes every bit as good as a homemade cake.
The first group is creating slop. The second group, I think, is baking. And OP is in the second group.
Replace "oven" with a dish washer or a washing machine for your clothes. Those things do exactly all of this. Yet we still complain about washing clothes and doing the dishes, even though it is far less effort than anything our parents did, or their parents before them.
If you commission a baker to bake you a cake, did you make the cake? What if you added sprinkles on top?
If you commission a baker, another person, with wants and desires of their own, is involved.
If you use an AI, there isn't.
Either way, it's clear that the author (yes, the author) put a lot of work into this by iterating and shaping it to what he wanted, and that's a lot more than sprinkles.
> If you commission a baker, another person, with wants and desires of their own, is involved.
> If you use an AI, there isn't.
What is the functional difference here? You are commissioning (see: prompting) someone (see: an AI) for a piece of work, or artwork or whatever. The output is out of your control, and I don't think the existence or absence of a human on the other end materially matters.
If we had hyper-advanced ovens from The Jetsons where we could type a prompt using a fold-out keyboard and it would magically generate whatever cake we ask of it: did we or did we not bake that cake? And I do not think it is clear the author put a lot of work iterating and shaping it into what he wanted; we have zero insight into that.
I didn't say the difference was functional. If you don't think the presence of a human on the other end matters (materially or not), feel free to continue this conversation with an LLM simulation of me. You can even prompt it so that you logically triumph and convince "me".
I'm asking you to explain what the actual difference is and you're avoiding the question.
If we had a complete black box where you submitted Prompt and out came Thing, and you had zero clue what said black box actually did, could you claim creation over Thing? What does knowing that it's a human vs LLM make materially different in terms of whether or not you created it?
And I - or did I turn this thread over to an LLM already? - am asking you a question in return, whose answer should give you the answer you want.
No please, I also agree with parent poster. Talk to the LLM, cause the human ain't listening.
Eh.
Why would I give him the same credit I would give a writer.
Or why would I give a writer the same credit I would give someone who created the AI prompts and scaffolding to generate this?
Being unhappy about not being able to call oneself an author ends up betraying a lack of confidence in the work or process.
In the end writer, dancer, actor, whatever - these titles come from their impact.
There will be a different name for this, and eventually there will be something made that is good enough that people will be spellbound. At which point it’s going to be called something else.
Ironically, the story can be read as gesturing in that direction, as it's ostensibly about giving a new title to a particular job.
In general, though, I think part of the mistake people keep making is that they try to imitate what would be valuable to engage with if a human wrote it, in an attempt to claim the role of the author of a book or whatever. There are likely art forms that are unique to what an LLM can facilitate, but trying to imitate human art forms is going to give you stunted results. The AI is very good at imitating the form but not the substance.
Once we stop trying to generate and pass off AI essays, novels, choose your own adventure stories, and all the other human genres as being human writing, we'll have a chance to figure out actually interesting artistic forms.
Largely, I agree with you. One famous counterpoint about labeling works of arts with the author: The Economist (the magazine) does not add the author to most of their articles.
> because we typically want to know the writer of a piece. we want to know where to lay credit.
Does the average person really care all the time? Maybe about the outlet it comes from as a whole (factuality, political lean), but more rarely about the exact author. Many don’t even have the critical skills for any of it and consume whatever content is chosen for them by whatever algorithm is there. We probably should care, I just don’t think a lot of us do.
For me, needing to know that something’s written by AI serves three purposes:
1) acknowledging that it might be slop that someone threw together with no effort (important in regards to spam)
2) acknowledging that depending on the model the factuality might be low when it comes to anything niche (though people are wrong too, often enough)
3) mentally preparing myself for AI bullshit slop language, like “It’s not X, it’s Y.”, or just choose not to engage with it (it's the same disgust reaction as when I find a PDF and realize it's just scanned images, not proper text)
In general, unless the goal is either human interaction or a somewhat rare case of wanting to read a specific blog etc., most of the time I don’t categorically care whether something was lovingly created by a human or shoved out by a half-baked version of Skynet - only that it’s good enough for whatever metrics I want to evaluate it by. I’m not ashamed of it, and maybe that’s why I don’t take issue with AI-generated code either, as long as it’s good enough (sometimes better than what people write, other times quite shit when the models and harnesses are bad).
In Peter Watts's Blindsight, the aliens understand language as spam, a hostile attempt to waste their time, and respond by opening fire.
Reading LLM slop without warning makes me see their point of view.
I think there are useful ways to engage with LLM writing, but they are often very different from those for human writing.
A human writer, a good one, often has ideas that are denser than the words on the page, and close reading is rewarded by helping you unpack the many implications.
With AI writing, there are usually fewer ideas than words, and so it requires a different kind of engagement. Either the human prompter behind it didn't supply enough ideas, or they were noncommittal enough that their very indecision got baked in.
LLMs are very prone to hedging and circling around a point while not saying much of anything. Maybe it is the easiest way to respond to RLHF incentives and corporate-speak training data. Or maybe they're just intrinsically stuck on being unable to find the right next token so they just endlessly spiral around via all of the wrong ones. Either way, there's often a whole lot of cotton candy text that dissolves when you try to look at it more closely.
can't reply to your comment below so i will comment here
> why does it bother you to give attribution? why do you think crediting the writer impacts how the piece stands?
clearly it does to you?
thing is, this is a fool's errand to try to police what people credit when there is zero capability of verification and enforcement
the current social norms still value authorship, so people will just take or omit credit as they see most advantageous, even if it's merely an ego advantage, which it typically is, if only as a proxy for brand building
what will happen if/when the currency of attribution is completely altered? hard to predict
my prediction is that track record will be considerably more important, not less, but human merit will be increasingly seen as irrelevant