Monday, October 7, 2024

Is Journalism Under Threat From AI?

It’s certainly possible that AI technologies can undermine or even threaten journalism as a public service, a profession, and a craft. Between disinformation, job losses, inaccuracies, and biases, the perceived dangers and negative impacts of the AI systems known as large language models seem, at this point, to far outweigh any potential benefits to the industry. But the greatest threat AI poses, in my opinion, is that it will take over the creative process.

(Do AIs Dream of Electric Deeps?)

I’m no tech expert and I don’t use generative AI, but as an experienced journalist, I feel a great urgency to discuss its effect on our profession. I firmly believe in preserving the integrity of journalism by keeping humans in full control of every stage of news gathering, from ideation to fact checking. These may be lofty goals, as most media company executives are probably chomping at the bit to start replacing people with bots. Others enjoy the fact that AI can perform their monotonous tasks, freeing them up for the fun stuff. And I suppose it does have its uses.

Whether you love it or hate it, we must closely monitor the use of AI in journalism, because it’s not going away, and is rapidly evolving and improving every day. But I also believe in the human spirit and our inherent love of storytelling. This alone could save the industry by ‘keeping it real.’

Since November 2022, we’ve experienced a huge leap in technology with the introduction of generative AI that can create new output, including text and images, which is often indistinguishable from human-generated content. Although most of us have been using AI in some form for many years, there’s a huge difference between spell checking a piece you wrote and prompting ChatGPT to draft a 1,200-word feature article from scratch that you can then claim as your own and monetize.

Maybe a better question would be, do you prefer consuming content created by artificial intelligence or by a human being? And will there be systems and rules in place to distinguish between them?

The business of journalism has been ‘under threat’ from technological advances for decades, with digitalization drastically changing how news is distributed and consumed. At the same time, the digital era has democratized journalism, opening doors for independent and citizen journalists, which is, overall, a positive development.

But digitalization has also decimated local print journalism, which is a huge loss. Citizen journalists and local online ‘news’ channels fill in some of the gaping holes left by the declining newspaper publishing industry, which lost almost 60% of its workforce between 1990 and 2016 according to the US Bureau of Labor Statistics. But they haven’t yet been able to replicate what newspaper reporters and magazine writers had the time, patience, and training to produce—thoughtful, thorough, boots-on-the-ground journalism, relying heavily on primary research and sources.

Since the introduction of ChatGPT in November 2022, however, the creative process of journalism itself has been potentially compromised. And that’s a whole different story.

By now most people are familiar with ChatGPT (the GPT stands for Generative Pre-trained Transformer), which has many capabilities, from answering questions and explaining complex topics to creating output such as social media posts, poetry, essays, and complete articles. The responses ChatGPT generates are based on the data it was trained on, so one of the biggest problems is the bias present in that training data. There are also concerns over the false information it may produce and over whether compensation should be awarded to the writers, artists, and musicians whose work was used (without permission) to train the AI models.

In November 2023, OpenAI introduced an updated version, GPT-4 Turbo. Other companies have since developed their own models, such as Grok from X (formerly Twitter). And platforms like Midjourney and DALL-E can generate images in a matter of seconds, including scenes that don’t exist.

AI tools have been around for years and have obvious benefits. For example, I’d be lost without my phone’s GPS. Many writers use Grammarly and transcription services for interviews. The Associated Press has been using AI in its newsrooms since 2014 and has even made a deal with OpenAI to license its archive of news stories dating back to 1985. The AP has also created a list of guidelines for AI use in newsrooms, which you can read on its blog (see Sources below).

But ChatGPT and its counterparts have pushed the use of AI to another level: creating content. This is where it gets problematic. If you’re an experienced, dedicated researcher and writer who has spent years learning and perfecting your skills and craft, would you feel comfortable claiming authorship of a piece you didn’t actually write? And would you fully trust the output of a generative AI model? After all, it would be difficult to cite the source of its information.

What is the Purpose of Journalism?

Journalism is a valuable public service, often referred to as the ‘fourth pillar of democracy’ or the ‘fourth estate.’ According to the American Press Institute, the central purpose of journalism is to “provide citizens with accurate and reliable information they need to function in a free society” (https://americanpressassociation.com/principles-of-journalism/).

From covering government policies to business dealings, journalism must be disciplined, with the facts presented in a clear, neutral manner, free of opinion and emotion. Being ‘the watchdogs of democracy’ and holding governments, institutions, and businesses accountable is a serious responsibility, one we can’t simply hand over to machines. The consequences, including potential conflicts of interest and the use of propaganda, could be quite severe.

AI can be safely used for repetitive tasks like data analysis, outlining, and transcription. But some people are using ChatGPT and other systems to create first drafts of articles, publishing them unedited, and calling their ‘efforts’ journalism.

*****

The Writer’s Digest Guide to Journalism is a practical, informative, and well-researched introduction to journalism and its best practices, with actionable advice, tips, techniques, explanations, and anecdotes straight from the field. In this digital guide, writers will learn how to write an effective news piece, the skills needed to be an effective journalist, outlets for publishing journalism, journalism associations, and so much more. Both inspirational and pragmatic, The Writer’s Digest Guide to Journalism is packed with valuable resources for aspiring journalists.

*****

The harsh reality is that anyone can legally claim the title of journalist, and journalism takes many forms, including blogs and personal essays. We’re now inundated with such content, and we can assume that much of it is AI generated. Why spend time learning the craft and doing the hard work when a robot can do it all for you?

Many articles and blogs are poorly researched and badly written but monetized regardless of quality. Content creators don’t care if they’re being disingenuous. Their goal is making money, and they will continue as long as it’s easy and profitable, ethics and morals be damned. But disinformation can also be easily created and distributed through AI systems, and the technology has made it simple to manipulate images and video footage. Soon it will be hard to distinguish real material from fakes, rendering photographic and video evidence untrustworthy, and therefore useless.

But people will soon tire of this bland content, with its inevitable uniformity, and start craving, even demanding, original, authentic work. This should encourage those of us who refuse to use AI for creative purposes to continue the hard work of producing real writing and journalism.

I believe the whole journalistic process, from finding and researching stories to interviewing, writing the first draft, editing, and fact checking, should be AI-free. I would even advocate transcribing your own interviews, since listening back after the fact can spark ideas, and often the piece will take shape in your mind as you do. Those of us with decades of experience don’t need AI; we’ve managed quite well without it, and we’ve mastered the craft because we’ve had to learn every step ourselves.

Using AI is basically cheating: cheating yourself out of an education gained through doing and experience, and calling yourself a writer or journalist without doing the important grunt work. And if you decide to use AI to write a piece, even if it’s only an outline or a first draft (which is where most of the magic of writing occurs, by the way), then this should be declared.

I can’t say this enough: We must clearly label human-created journalism and AI-generated content.

AI is good at mimicking humans and can be prompted to write in anyone’s ‘style,’ but it has never experienced, and never will experience, life firsthand. AI is incapable of original thought and ideas; it merely recreates and spits out content from the information it has ingested. Only a real person can write a compelling, empathetic, and genuine piece exploring and describing the human condition.

Talking to people in ‘real life’ is the best way to do journalism. AI can only work with digital data and information; it cannot go out into the field and interview victims and witnesses, perform on-scene reporting, connect with people, be embedded, nurture contacts, or provide the ‘human touch’ that is vital when covering sensitive subjects. It’s incapable of finding organic leads and gaining the trust of primary sources. And since, in my view, this is what constitutes real journalism (reporting in the community, on the ground, in the field, with the people), journalism is safe, and we have nothing to fear. At least not yet.

Sources:

https://www.theverge.com/2023/7/13/23793810/openai-associated-press-ai-models
https://gijn.org/stories/10-things-you-should-know-about-ai-in-journalism/
https://www.aljazeera.com/opinions/2023/7/19/what-future-for-journalism-in-the-age-of-ai
https://apnews.com/article/openai-chatgpt-associated-press-ap-f86f84c5bcc2f3b98074b38521f5f75a
https://blog.ap.org/standards-around-generative-ai
https://www.poynter.org/business-work/2023/2023-news-deserts-report-penny-abernathy-medill/
https://localnewsinitiative.northwestern.edu/projects/state-of-local-news/explore/#/localnewslandscape