Of course, there’s no shame in using tools that are available to us. We’re a tool-using species. We’re just a bunch of stupid monkeys without tools. A lot of what we do is about using tools to free up time for more interesting things than the work the tools already do better than we can.
Like it or not, people are using LLMs a lot. The output isn’t universally good. It depends on what you ask for and how you criticize what comes back. But the simple reality is that the tools are pretty good these days. And not using them is a bit of a mistake.
You can use LLMs to fix simple grammar and style issues, to fact-check argumentation, and to criticize and identify weaknesses. You can also task LLMs with doing background research, double-checking sources, and more.
I’m not a fan of letting LLMs rewrite my text into something completely different. But when I'm in a hurry or in a business context, I sometimes let LLMs do the heavy lifting for my writing anyway.
Ironically, a good example is this article, which makes a few nice points. But it’s also full of grammar and style issues that could easily be remedied with LLMs without really affecting the tone or line of argumentation (though IMHO that needs work as well). The author is clearly not a native speaker. But that’s no excuse these days for publishing poorly written text. It's sloppy and doesn't look good. And we have tools that can fix it now.
And yes, LLMs were used to refine this comment. But I wrote the comment.
If the tool does the task for you, then you didn't do the task. I don't keep my food cold; my refrigerator does. I just turned it on. This doesn't matter unless I'm for some reason pretending that I myself am keeping my food cold somehow, at which point it becomes a lie.
When a tool blurs the line between who performed the task, and you take full credit despite being assisted, that is deceitful.
Spell checking helps us all pretend we're better spellers than we are, but we've decided as a society that correct spelling is more important than proving one's knowledge of spelling.
But if you're purportedly a writer, and you're using a tool that writes for you, then I will absolutely discount your writing ability. Maybe one day we will decide that the output is more important than the connection to the person who generated it, but to me, that day has not arrived.
When does a woodworker cease to be one? When he uses a handsaw? A circular saw? A sawmill?
> When a tool blurs the line between who performed the task
Who saws the wood? He who operates the tool, or the tool performing its function? What is the value of agency in a business that, supposedly, sells product? Code authorship isn't like writing, is it? Should it be?
Or is the distinction not in the product, but in the practice? Is it the difference between woodworking and lumber processing?
Or is it about expectation? e.g. when we no longer expect a product to be made by hand due to strong automation in the industry, we prepend terms such as "hand-made" or "artisanal". Are we currently still in the expectation phase of "software is written by hand"?
I have no dog in this race, really. I like writing software, and I like exploring technology. But I'm very confused and have a lot of questions that I have trouble answering. Your comment resonated though, and I'm still curious about how to interpret it all.
There's also the perception of time. How long did it take you to write that email/comment/code? Did you laboriously pore over every word, every line, for hours before you hit send, regardless of whether you used an LLM or not? Or did you spend barely five minutes and just paste whatever ChatGPT shit out?
That's the real question that people are trying to suss out.
I recommend The Shape of Actions by Harry Collins for a book-length, sociologically adept treatment of this idea.
My books were edited by professional editors. These editors did not have, and did not need, any special expertise in my topic. So there's a fundamental difference between grammar checkers or spell checkers and a tool like an LLM, which works in a way that subtly or overtly modifies or contributes to the ideas themselves.
I don't mind that a doctor wears a white, nicely laundered lab coat with a stethoscope so that he looks like a doctor. I mind when some rando impersonates a doctor. This is why I distrust people who write by asking ChatGPT to generate text for them.
I like the distinction between syntactic tools, like spellcheck, and semantic tools, like AI. The former clearly doesn't impugn the author; the latter does. They seem clearly and fundamentally different to me.
Where do you draw the line? What do you do with the ambiguous categories?
Clearly a trucker does not "deliver goods" and a taxi driver is not in the business of ferrying passengers - the vehicle does all of that, right?
Writers these days rarely bother with the actual act of writing now that we have typing.
I've rarely heard a musician, but I've heard lots of CDs and they're really quite good - much cheaper than musicians, too.
Is my camera an artist, or is it just plagiarizing the landscape and architecture?
I'm not sure it makes sense to assume the creative act of a person writing to other people, which is fundamentally about a consciousness communicating to others, is anything like delivering goods.
The distinction I pointed out, applied to people producing writing intended for other people to read, seems to give a really clear line. With syntactic tools you're still fully producing the writing; with semantic tools you're not. You can find some small amount of blurriness if you really want, like whether using a thesaurus counts as semantic, but it seems disingenuous to pretend that has anywhere near the same impact on the authorship of a piece as using AI.