Blogging has always required aggressive titles. My best posts for years all used this "you"- or "we"-focused framing too. Trying to solve people's biggest problems!
Unfair or not, same thing for me.
Then I'm no longer focused on the content so much as scanning it for signs of AI slop, so I don't have to waste brainpower consuming what took no brainpower to produce.
Perhaps that's also unfair, but I think writers in particular, like the author of this post, should be aware enough of the patterns of AI-written slop to consciously avoid them nowadays.
It doesn't matter if you used to write like this; the reality is that people will question you now if you do.
It definitely has a lot of signs of AI writing, but at the same time the flow doesn't really scream AI to me.
Even before AI, I think I'd seen this used in self-help books and therapy-type writing. It has always felt like an intellectually lazy attempt at reframing: painting things as black and white in the form of a thought-terminating cliché. "It's not X, it's Y" discounts X entirely, when usually the relationship between X and Y is more nuanced: "X and also Y", "X because Y", etc.
Also, if you do want to use "it's not X, it's Y" as a clincher, you'd better make sure that Y in fact builds on X in some way (which implies that X and Y have to be similar enough to be plausibly associated with each other), and that Y isn't just some orthogonal concept.
100%. “It’s not [x]. It’s [y].” is highly overused by ChatGPT in particular. I hope this article isn’t just AI slop, but that’s not a great start.
You're not being unfair. You're showing wisdom.
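As an aside, the "It's not X, it's Y" construction discussed above is regular enough that you can flag it mechanically in your own drafts. Here is a minimal sketch in Python; the regex, function name, and the 60-character clause limit are illustrative choices, not taken from any real linting tool:

```python
import re

# Illustrative sketch: mechanically flag the "It's not X, it's Y"
# construction. The pattern and its 60-character clause limit are
# hypothetical heuristics, not from any real tool.
PATTERN = re.compile(
    r"\b(?:it'?s not|it is not|it isn'?t)\b"   # "It's not X"
    r"[^.?!]{0,60}[.,;!]\s*"                   # X, up to a clause break
    r"(?:it'?s|it is)\b",                      # "... it's Y"
    re.IGNORECASE,
)

def flag_cliche(text: str) -> list[str]:
    """Return each stretch of text that uses the 'not X, it's Y' framing."""
    return [m.group(0) for m in PATTERN.finditer(text)]
```

For example, `flag_cliche("It's not magic. It's engineering.")` flags the sentence pair, while prose using the more nuanced "X because Y" framing passes untouched.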