AI has its uses but isn’t close to having a major role in journalism
Published 6:40 am Sunday, December 17, 2023
RICK PFEIFFER
rick.pfeiffer@niagara-gazette.com
An opinion columnist for a pair of newspapers in Western New York, Jim Shultz chose a symbolic moment to dip his toes into the world of artificial intelligence.
On the first anniversary of the release of the AI tool ChatGPT, Shultz decided to experiment. He wanted to see if AI could replace him and other local newspaper columnists, so he asked the program to write a pair of opinion columns, one on potholes and the other on a pizza sauce debate.
“I thought people are not really talking about (AI),” Shultz said. “And I wondered, did (an AI-generated story) seem like good journalism to them?”
The AI story on potholes featured a quote from the city’s mayor, pleading for patience from drivers trying to navigate their “bumpy rides.” It also reported the plight of motorist John Smith, “a Lockport resident” who had to “replace two tires … because of potholes.”
It was a sobering experience for Shultz.
“(The AI program) makes stuff up,” he said flatly. “The quotes from the mayor are made up. John Smith was a completely imaginary figure.
“It hallucinated a fake citizen.”
News organizations across the country, including CNHI, continue to evaluate what the future of AI in journalism will look like. While AI can increase some efficiencies — research, story ideas — it also requires a traditional hands-on approach and due diligence: confirming information from multiple sources and editing all AI-generated content before publication.
Researcher David Caswell, writing for the Reuters Institute for the Study of Journalism this fall, noted that “AI has placed journalism at the cusp of significant change,” while also observing, “There are no best practices, textbooks or shortcuts for this yet, only engaging, doing and learning until a viable way forward appears.”
Damage control
Damian Radcliffe, the Carolyn S. Chambers Professor in Journalism at the University of Oregon and a fellow at the Tow Center for Digital Journalism at Columbia University, says AI’s habit of “hallucinating” information shows that the technology is not ready for prime-time journalism.
“We’ve seen some of the failures of AI at Gannett and (other media companies),” Radcliffe said. “It’s pretty clear there’s a lot of rookie mistakes made and it undermines trust in the media.”
In January, the tech website CNET and its sister publication Bankrate went into damage control mode after revealing that they had published dozens of error-riddled articles using AI programs. This week, the CEO of the company that oversees Sports Illustrated was fired after the outlet published stories with fake author names and profile photos generated by AI.
“(AI) has a lot of potential, but it has to be used with diligence,” Radcliffe explained. “You can’t use it for writing whole articles or rewriting press releases. The technology is not foolproof. It can hallucinate and make things up or have factual inaccuracies. It reinforces the need for human editors and editors who are subject experts and can provide appropriate checks and balances.”
Where current AI technology can deliver benefits in a newsroom is by providing relief for reporters from so-called “low-level tasks,” like filling in data for boilerplate material such as sports box scores and financial figures.
“This type of reporting has been (done) for decades at the Associated Press and the L.A. Times,” Radcliffe said. “It’s stuff with a formula and the information can be tucked into a boilerplate.”
While Radcliffe said AI technology offers the ability to give local journalists more time to spend on what he called, “the kind of journalism people really want — investigations and enterprise reporting,” he stressed the need to be completely transparent when a newsroom is using AI.
“Make it clear it’s not a human writer,” he said. “Don’t try to trick your audience.”
Deceiving
Shultz, in his experiment, admitted the AI columns could have been deceptive.
“They were pretty well written,” he said. “You can quibble on style, but they weren’t bad.”
The possibility of deception reinforces the need for “guardrails” on AI’s use.
“Guardrails are our bread and butter,” said Dalia Hashim, the AI and Media Integrity Program Lead at the Partnership on AI, which maintains the AI Tools for Local Newsrooms Database. “Where should AI tools be used and how do you create the right governance for using those tools? You have to be transparent with audiences.”
The AI Tools for Local Newsrooms Database provides information (such as potential applications, expertise needed, and cost) about automated tools (including investigative, content creation and audience engagement tools) that could be useful to journalists.
The group also gathers experts from the media, the AI industry, civil society and academia to develop guidance on the adoption of AI for news organizations.
Like Radcliffe, Hashim suggested that the best use of AI is to “automate tasks and spread (journalism) resources to allow for more coverage.” She also noted that AI can be used to translate stories into additional languages to reach more diverse communities.
The arrival of computers and cell phones in newsrooms in the late 1980s and early 1990s ushered in a technological revolution in how local news was covered. Radcliffe said the arrival of AI is just another step in the evolutionary process.
“I think it’s hard to postulate (the future of AI) even a year from now,” Radcliffe said. “But I think it will just become a structured part of our workflow, like computers and cell phones.”