Given how easy it is to access and use ChatGPT, what’s stopping a writer from putting an assignment brief into the chatbot, waiting for a story to generate and then submitting that to an editor?
When that question was posed to heads of editorial teams at five different publishers, the simple answer was nothing. None of the editors or heads of content at Bustle Digital Group, Gizmodo, Forbes, Hunker or Trusted Media Brands had explicitly told contributors not to use ChatGPT to the degree of generating a story or had updated their freelance contracts since ChatGPT’s launch.
At least, not yet.
“Well, you just gave me a great idea,” said Beth Tomkiw, TMB’s chief content officer. “It’s so new … but we should be putting that in all of our [contracts]. We have steps in our process to check if anything’s been plagiarized, and that’s going to be an even murkier territory — unless we openly make a statement about that.”
Jill Schildhouse, a freelance writer and editor, said she had not received any communication regarding ChatGPT from the publications she’s worked with. And that’s left her in the dark as to those companies’ policies regarding content generated by artificial intelligence (though, it should be noted, she hasn’t been impressed in her recent experimentation with ChatGPT).
“I don’t know what any of my outlets’ stance is on AI content. How do they know that their freelancers aren’t submitting AI content? Are there plagiarism concerns? What other ethical concerns surround this? Outlets are going to have to come up with some type of guidelines around how it’s used, both internally … [and] with their freelancers,” Schildhouse said.
Alesandra Dubin, another freelance writer and editor, said only one of the roughly 15 publications she’s worked with had sent out a memo regarding ChatGPT. She declined to share which publisher, but Dubin said the company communicated an “explicit policy change, and there can be no usage of AI in contributed work.”
“There are a lot of unanswered — or unknown to me — questions about plagiarism or what’s expected of me under the contracts with the various outlets to which I contribute,” Dubin said. “Outlets really need to put some guidelines in place, and writers need to think long and hard before taking what appears to be the easy path. Because the second you introduce a major error into your content, you’ve lost your readers and that publication is probably never going to work with you again.”
Is the tech even there yet?
If freelancers are using ChatGPT to generate stories, they might actually be creating a path for their own demise.
“If I found out a freelancer was using ChatGPT, why would I pay that freelancer? Because I could just use ChatGPT myself,” said David Ewalt, editor-in-chief of G/O Media’s tech site Gizmodo. “If you were doing that, that’s not what I’m paying you for.”
Marc Lavallee, the Knight Foundation's director of journalism, who oversees technology and product investment, agreed that AI technology shouldn't be doing most of the work when it comes to writing. The technology isn't that advanced yet, and it has the potential to make writers who use it seem obsolete.
“If the machine did most of the work, you should probably disclose that. But also if the machine did most of the work and you’re disclosing that, then what were you doing there?” Lavallee said.
But Forbes’ chief content officer Randall Lane and Eve Epstein, svp and gm at Leaf Group’s Hunker, both said they thought it was too early to have guidelines on the use of ChatGPT in place, especially since the technology isn’t advanced enough to generate a story worthy of submitting to an editor.
However, Epstein said heads of editorial teams do need to be thinking about how to communicate best practices with staff and freelancers alike. “I think it would be irresponsible not to be thinking about this stuff,” she said.
“Best practices emerge out of a lot of practice,” Lavallee added. “Right now is the time to be doing the practice to understand what those best practices are for organizations.”
‘The ethics of AI in the newsroom’
At the end of the day, transparency between a contributor and editor is key — and the arrival of ChatGPT doesn’t change that.
“I would think that the expectations should be clear with a freelancer — any work they submit would have to be their own original work, which I don’t think is too dissimilar from what the existing expectations are,” Joseph Lichterman, head of editorial and communications at The Lenfest Institute for Journalism, said in an email.
But there’s also a spectrum when it comes to ChatGPT use, Lavallee said. For example, using it to generate a headline is different from using it to generate the bones of a story. So the question becomes, to what degree is a freelancer’s use of ChatGPT acceptable, and at what point have they gone too far?
“One of the things that is tricky now and is going to continue to get trickier is trying to establish thresholds,” Lavallee said.
Alex Mahadevan, director of MediaWise at The Poynter Institute, said this “gray area” that comes with new technology is something to be concerned about.
“In the world of journalism and transparency for readers, there really should be no gray areas. It should be very, very clear how the news gathering process works,” he said. “Let’s get a handle on the ethics of AI in the newsroom before we go wild with it.”