Artificial intelligence is not only here to stay but constantly evolving, leaving the journalism industry grappling with how to incorporate these powerful tools into its newsrooms.
While AI is shifting the ways newsrooms operate, the technology is also causing college journalism professors to reevaluate how they incorporate generative tech into their curricula as they train the next generation of journalists.
Jeremy Caplan, director of teaching and learning at the City University of New York, said journalism schools are at the beginning of a “long and important” conversation around AI and reporting.
“As educators we have to figure out where the lines are for when to use AI and when to not,” said Caplan, who is also the founder of the Wonder Tools newsletter. “We can’t ignore or ban it.”
Instead, Caplan believes that “this is the beginning of experimentation, exploration and innovation” with artificial intelligence tools.
Although conversations around AI have grown exponentially since the release of tools like OpenAI's ChatGPT, newsrooms have been using the technology for years.
The Associated Press started using AI in 2014 to automate stories on its business news desk, The Washington Post has used a bot to cover elections and sports since 2016, and Bloomberg has used AI to aid searches on the company's terminal software.
Now, much of the conversation consists of how newsrooms can become more efficient by delegating more mundane or rote tasks to AI: analyzing data, understanding social media analytics and helping to come up with drafts of possible interview questions, to name a few.
Andrew DeVigal, director of the Agora Journalism Center at the University of Oregon, said he suspects some students have used AI on class projects, and some have even told him they did.
“I think that transparency for when someone is using artificial intelligence is key,” he said.
DeVigal said one boundary he draws with AI is whether using it takes away someone's livelihood. He said it could be used ethically to draft outlines, generate ideas and aid research, as long as a person edits the results.
He draws the line if it infringes on a copyright or if someone is using it to plagiarize.
“Reporters who incorporate AI into their work will have leverage,” DeVigal said. “It will separate us from society.”
Dalia Hashim, research and programs lead for Partnership on AI, said one of the best ways journalism professors can prepare incoming journalists to use AI is to teach what the limitations of the technology are.
“Students should learn AI on a basic level in a way that will build their work up, instead of taking down their craft,” Hashim said.
Tony Elkins, a faculty member at the Poynter Institute, embraces technology but said he believes AI will never be able to report and write the same way a human journalist does.
“Storytelling is so personal, so we have to be thoughtful and knowledgeable as we use it,” Elkins said.
Beyond the ways college professors are thinking about artificial intelligence, the topic dominated the 2023 Online News Association conference in Philadelphia.
Diana Lopez, ONA’s program coordinator, said this is the first year that the conference has flagged sessions and panels discussing AI. There were about 10 AI-related sessions this year, while there were only two at last year’s ONA conference in Los Angeles.
Kirsten Eddy, senior researcher for the Center for News, Technology & Innovation, said journalists must be aware of some of the dangers of using AI, like factual errors, biases in the technology and where AI pulls its information.
However, she said, implementing some AI into journalism work could come with benefits.
“This is not a downfall,” Eddy said. “Using AI can create room for more innovation and creativity.”