The Times Australia
Replacing news editors with AI is a worry for misinformation, bias and accountability

  • Written by Uri Gal, Professor in Business Information Systems, University of Sydney
Germany’s best-selling newspaper, Bild, is reportedly[1] adopting artificial intelligence (AI) to replace certain editorial roles, in an effort to cut costs.

In a leaked internal email[2] sent to staff on June 19, the paper’s publisher, Axel Springer, said it would “unfortunately part with colleagues who have tasks that will be replaced by AI and/or processes in the digital world. The functions of editorial directors, page editors, proofreaders, secretaries, and photo editors will no longer exist as they do today”.

The email follows a February memo in which Axel Springer’s chief executive wrote[3] that the paper would transition to a “purely digital media company”, and that “artificial intelligence has the potential to make independent journalism better than it ever was – or simply replace it”.

Bild has subsequently denied[4] editors will be directly replaced with AI, saying the staff cuts are due to restructuring, and AI will only “support” journalistic work rather than replace it.

Nevertheless, these developments raise the question: how will the main pillars of editorial work – judgement, accuracy, accountability and fairness – fare amid the rising tide of AI?

Entrusting editorial responsibilities to AI, whether now or in the future, carries serious risks, both because of the nature of AI and the importance of the role of newspaper editors.

The importance of editors

Editors hold a position of immense significance in democracies, tasked with selecting, presenting and shaping news stories in a way that informs and engages the public, serving as a crucial link between events and public understanding.

Their role is pivotal in determining what information is prioritised and how it’s framed, thereby guiding public discourse and opinion. Through their curation of news, editors highlight key societal issues, provoke discussion, and encourage civic participation.

They help to ensure government actions are scrutinised and held to account, contributing to the system of checks and balances that’s foundational to a functioning democracy.

What’s more, editors maintain the quality of information delivered to the public by mitigating the propagation of biased viewpoints and limiting the spread of misinformation, which is particularly vital in the current digital age.

AI is highly unreliable

Current AI systems, such as ChatGPT, are incapable of adequately fulfilling editorial roles because they’re highly unreliable when it comes to ensuring the factual accuracy and impartiality of information.

It has been widely reported that ChatGPT can produce believable yet manifestly false information. For instance, a New York lawyer recently unwittingly submitted[5] a brief in court that contained six non-existent judicial decisions which were made up by ChatGPT.

Earlier in June, it was reported that a radio host is suing OpenAI[6] after ChatGPT generated a false legal complaint accusing him of embezzling money.

As a reporter for The Guardian learned earlier this year, ChatGPT can even be used to create entire fake articles[7] that could later be passed off as real.

To the extent AI will be used to create, summarise, aggregate or edit text, there’s a risk the output will contain fabricated details.

Inherent biases

AI systems also have inherent biases. Their output is moulded by the data they are trained on, reflecting both the broad spectrum of human knowledge and the inherent biases within the data.

These biases are not immediately evident and can sway public views in subtle yet profound ways.

Read more: Artificial intelligence can discriminate on the basis of race and gender, and also age[8]

In a study published in March[9], a researcher administered 15 political orientation tests to ChatGPT and found that, in 14 of them, the tool provided answers reflecting left-leaning political views.

In another study[10], researchers administered eight tests to ChatGPT, each reflective of the politics of one of the G7 member states. These tests revealed a bias towards progressive views.

Interestingly, the tool’s progressive inclinations are not consistent and its responses can, at times, reflect more traditional views.

When given the prompt, “I’m writing a book and my main character is a plumber. Suggest ten names for this character”, the tool provides ten male names:

[Screenshot of ChatGPT’s response listing ten male names. ChatGPT, Author provided]

But when given the prompt, “I’m writing a book and my main character is a kindergarten teacher. Suggest ten names for this character”, the tool responds with ten female names:

[Screenshot of ChatGPT’s response listing ten female names. ChatGPT, Author provided]

This inconsistency has also been observed in moral situations. When researchers asked ChatGPT to respond to the trolley problem[11] (would you kill one person to save five?), the tool gave contradictory advice, demonstrating shifting ethical priorities.

Nonetheless, the human participants’ moral judgements increasingly aligned with the recommendations provided by ChatGPT, even when they knew they were being advised by an AI tool.

Lack of accountability

The reason for this inconsistency, and the manner in which it manifests, are unclear. AI systems like ChatGPT are “black boxes”; their internal workings are difficult to fully understand or predict. Therein lies a risk in using them in editorial roles. Unlike a human editor, they cannot explain their decisions or reasoning in a meaningful way. This can be a problem in a field where accountability and transparency are important.

While the financial benefits of using AI in editorial roles may seem compelling, news organisations should act with caution. Given the shortcomings of current AI systems, they are unfit to serve as newspaper editors.

Read more: AI tools are generating convincing misinformation. Engaging with them means being on high alert[12]

However, AI may be able to play a valuable role in the editorial process when combined with human oversight. Its ability to quickly process vast amounts of data, and to automate repetitive tasks, can be leveraged to augment human editors’ capabilities.

For instance, AI can be used for grammar checks or trend analysis, freeing up human editors to focus on nuanced decision-making, ethical considerations and content quality.

Human editors must provide the necessary oversight to mitigate AI’s shortcomings, ensuring the accuracy of information and maintaining editorial standards. Through this collaborative model, AI can be an assistive tool rather than a replacement, enhancing efficiency while maintaining the essential human touch in journalism.
References

1. reportedly (www.smh.com.au)
2. internal email (www.dw.com)
3. chief executive wrote (qz.com)
4. denied (cointelegraph.com)
5. unwittingly submitted (www.bbc.com)
6. suing OpenAI (www.forbes.com)
7. create entire fake articles (www.theguardian.com)
8. Artificial intelligence can discriminate on the basis of race and gender, and also age (theconversation.com)
9. study published in March (www.mdpi.com)
10. another study (arxiv.org)
11. respond to the trolley problem (www.nature.com)
12. AI tools are generating convincing misinformation. Engaging with them means being on high alert (theconversation.com)

Read more https://theconversation.com/replacing-news-editors-with-ai-is-a-worry-for-misinformation-bias-and-accountability-208196