> Anyone who uses AI tools in our editorial workflow is responsible for the accuracy and integrity of the resulting work. This responsibility cannot be transferred to colleagues, editors...

This sounds like a direct callout to the incident earlier this year where an apparently sick staff member relied on an AI to reproduce quotes, and it did not reproduce them accurately. Ars retracted the article and the staff member was fired.

I have felt very ethically uneasy about this because the person was ill. I emailed the Ars editorial team directly to express concern about the labour conditions, and to note that it is the editorial team's responsibility to do things like check quotes.

Of course it is the journalist's responsibility: when you have a job, you do that job by policy (I wonder whether this policy existed in writing at the time of the firing?), and accuracy is part of the job. But I am also a firm believer that responsibility is greater at higher levels. This sounds like a direct abrogation of journalistic standards by the Ars editorial team.

> and to note that it is the editorial team's responsibility to do things like check quotes.

Publishing things online for free (as Ars does) is a difficult business. I doubt they can realistically afford an "editorial team" that checks quotes. Paying the journalists is expensive enough.

> apparently sick staff member relied on an AI to reproduce quotes

"Apparently sick", you couldn't phrase it more accurately.

Kudos for firing them, the only valid course of action for a publisher.

That's harsh. I feel any situation where someone is ill and apparently required to work (which, if true, is a labour issue) and makes mistakes while sick should be treated with a little kindness. I worry they were made an example of.

>This sounds like a direct abrogation of journalistic standards by the Ars editorial team.

We depend on an ecosystem of news and journalism to keep our polities informed.

However, if that ecosystem is starved, it will increasingly fail to live up to its standards, and we can expect those failures to affect us more and more.

I am not defending bad journalists, nor creating an excuse to tolerate such behavior in the future.

I am describing the macro trend we are facing, the failure state we can expect, and asking what happens if nothing grows to replace it.

The NYT now earns more revenue through games than through journalism and ads. Wikipedia is seeing fewer visitors because of AI summaries, which leads to lower donations. A review site I used has gone behind a full paywall.

I don't really see how Ars or most other sites will be able to earn revenue and pay salaries in this bot-first environment.

>We depend on an ecosystem of news and journalism to keep our polities informed.

If this is true and necessary, we might as well skip the middleman and have the news and the journalists run the polities.

Ars has a decently pricey direct subscription, doesn't it? With a lot of tech-focused features included. Their strategy is probably the best you could set up in this ecosystem.

If it isn't clear from this policy that Ars is run by the advertisers and not the subscribers, I don't know what would make it clear.

Advertisers only care about eyeballs and really bad press; AI increases the first and rarely causes the second.

My more cynical take is that this might be as subscriber driven as it's possible for a news outlet now. Keep an eye on 404 and see if they can resist the gravity of ads, I guess?

I agree with you; what I am noting is that traditional journalistic ethics (editors are responsible for fact-checking) are explicitly rejected by this policy.

They can simultaneously set standards for their staff -- as they should -- and retain professional standards for the more senior staff as well.

Removing responsibility from those more senior and making those more junior the only ones responsible would be a serious professional issue in any company. Here it also runs specifically contrary to the professional standards of their business area.

I see my parent comment is downvoted. Yet this is firmly the traditional ethical and professional stance, and I don't believe AI or any random upcoming technology should change it.