The Times Australia

An AI-driven influence operation is spreading pro-China propaganda across YouTube

  • Written by David Tuffley, Senior Lecturer in Applied Ethics & CyberSecurity, Griffith University

A recent investigation from the Australian Strategic Policy Institute (ASPI) has revealed[1] an extensive network of YouTube channels promoting pro-Chinese and anti-US public opinion in the English-speaking world.

The operation is well-coordinated, using generative AI[2] to rapidly produce and publish content, while deftly exploiting YouTube’s algorithmic recommendation system.

How big is the network?

Operation “Shadow Play[3]” involves a network of at least 30 YouTube channels with about 730,000 subscribers. At the time of writing, the channels had some 4,500 videos between them, with about 120 million views.

According to ASPI[4], the channels gained audiences by using AI algorithms to cross-promote each other’s content, thereby boosting visibility. This is concerning as it allows state messaging to cross borders with plausible deniability[5].
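As a purely illustrative sketch of the dynamic described above (the function, weights and numbers here are invented for this article, not drawn from the ASPI report), cross-promotion can be thought of as adding partner traffic on top of a channel's own audience, so a coordinated network scores far higher in an engagement-based ranking than an equally popular independent channel:

```python
# Toy model (illustrative only): how cross-promotion among coordinated
# channels can inflate a channel's apparent popularity. All figures invented.

def visibility(own_views, boosts_received, boost_weight=0.5):
    """Score a channel by its own views plus discounted partner traffic."""
    return own_views + boost_weight * sum(boosts_received)

# An independent channel relies on its own audience alone.
independent = visibility(1000, [])

# A coordinated channel with 29 partners each funnelling modest traffic,
# echoing the ~30-channel network described in the report.
coordinated = visibility(1000, [200] * 29)

print(independent)   # 1000.0
print(coordinated)   # 3900.0
```

A ranking system that cannot tell organic views from coordinated ones would treat the second channel as almost four times more popular, even though its genuine audience is identical.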

The network of videos also featured an AI avatar created by British artificial intelligence company Synthesia, according to the report[6], as well as other AI-generated entities and voiceovers.

While it’s not clear who is behind the operation, investigators say the controller is likely Mandarin-speaking. After profiling the behaviour, they concluded it doesn’t match that of any known state actor in the business of online influence operations. Instead, they suggest it might be a commercial entity operating under some degree of state direction.

These findings are the latest evidence that advanced influence operations are evolving faster than the measures designed to counter them.

Influencer conflicts of interest

One clear parallel between the Shadow Play operation and other influence campaigns is the use of coordinated networks of inauthentic social media accounts and pages amplifying the messaging.

For example, in 2020 Facebook took down[7] a network of more than 300 Facebook accounts, pages and Instagram accounts that were being run from China and posting content about the US election and COVID pandemic. As was the case with Shadow Play, these assets worked together to spread content and make it appear more popular than it was.

Read more: Scams, deepfake porn and romance bots: advanced AI is exciting, but incredibly dangerous in criminals' hands[8]

Is current legislation strong enough?

The current disclosure requirements around sponsored content have some glaring gaps when it comes to addressing cross-border influence campaigns. Most Australian consumer protection[9] and advertising regulation[10] focuses on commercial sponsorships rather than geopolitical conflicts of interest.

Platforms such as YouTube prohibit[11] deceptive practices in their stated rules. However, identifying and enforcing violations is difficult with foreign state-affiliated accounts that conceal who is pulling their strings.

Determining what is propaganda, as opposed to free speech, raises difficult ethical questions around censorship[12] and political opinions. Ideally, transparency measures shouldn’t unduly restrict protected[13] speech. But viewers still deserve to understand an influencer’s incentives and potential biases.

Possible measures could include clear disclosures when content is affiliated directly or indirectly with a foreign government, as well as making affiliation and location data more visible on channels.

How can you spot deceptive content?

As technologies become more sophisticated, it’s becoming harder to discern what agenda or conflict of interest may be shaping the content of a video.

Discerning viewers can gain some insight by looking into the creator(s) behind the content. Do they provide information on who they are, where they’re based and their background? A lack of clarity may signal an attempt to obscure their identity.

You can also assess the tone and goal of the content. Does it seem to be driven by a specific ideological argument? What is the poster’s ultimate aim: are they just trying to get clicks, or are they trying to persuade you to believe their viewpoint?

Check for credibility signals, such as what other established sources say about this creator or their claims. When something seems dubious, rely on authoritative journalists and fact-checkers.

And make sure not to consume too much content from any single creator. Get your information from reliable sources across the political spectrum so you can take an informed stance.
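The checks above can be sketched as a simple scoring heuristic. This is purely illustrative (the warning-sign names, and the thresholds, are invented for this sketch, not a vetted methodology): the more of these questions a piece of content fails, the more cautious a viewer should be.

```python
# Illustrative heuristic only: the questions above encoded as a checklist.
# The sign names and thresholds are invented for this sketch.

WARNING_SIGNS = {
    "creator_identity_unclear",        # no info on who/where the creator is
    "single_ideological_angle",        # content pushes one specific argument
    "no_independent_corroboration",    # established sources don't back claims
    "sole_information_source",         # viewer relies on this creator alone
}

def deception_risk(signals):
    """Count how many warning signs apply and bucket the result."""
    hits = len(set(signals) & WARNING_SIGNS)
    if hits >= 3:
        return "high"
    if hits >= 1:
        return "moderate"
    return "low"

print(deception_risk({"creator_identity_unclear",
                      "single_ideological_angle",
                      "no_independent_corroboration"}))  # high
```

No checklist can settle the question on its own, but a structured pass through these signals is a useful habit before sharing or trusting a video.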

The bigger picture

The advancement of AI could exponentially amplify[14] the reach and precision of coordinated influence operations if ethical safeguards aren’t implemented. At its most extreme, the unrestricted spread of AI propaganda[15] could undermine truth and manipulate real-world events.

Propaganda campaigns may not stop at trying to shape narratives and opinions. They could also be used to generate[16] hyper-realistic[17] text, audio and image content aimed at radicalising individuals. This could greatly destabilise our societies.

We’re already seeing the precursors[18] of what could become AI psy-ops[19] with the ability to spoof identities, surveil citizens en masse, and automate disinformation production.

Without applying an ethics or oversight framework[20] to content moderation[21] and recommendation algorithms, social platforms could effectively act as misinformation mega-amplifiers optimised for watch-time, regardless of the consequences.

Over time, this may erode social cohesion, upend elections, incite violence and even undermine[22] our democratic institutions. And unless we move quickly, the pace of malicious innovation may outstrip[23] any regulatory measures.

It’s more important than ever to establish external oversight[24] to make sure social media platforms work for the greater good, and not just short-term profit.

Read more: Facebook's algorithms fueled massive foreign propaganda campaigns during the 2020 election – here's how algorithms can manipulate you[25]

References

  1. ^ has revealed (www.aspi.org.au)
  2. ^ generative AI (www.techopedia.com)
  3. ^ Shadow Play (www.aspi.org.au)
  4. ^ ASPI (ad-aspi.s3.ap-southeast-2.amazonaws.com)
  5. ^ plausible deniability (www.cybersecurityintelligence.com)
  6. ^ the report (ad-aspi.s3.ap-southeast-2.amazonaws.com)
  7. ^ took down (about.fb.com)
  8. ^ Scams, deepfake porn and romance bots: advanced AI is exciting, but incredibly dangerous in criminals' hands (theconversation.com)
  9. ^ consumer protection (legalvision.com.au)
  10. ^ advertising regulation (www.accc.gov.au)
  11. ^ prohibit (support.google.com)
  12. ^ censorship (news.columbia.edu)
  13. ^ protected (www.abc.net.au)
  14. ^ exponentially amplify (www.technologyreview.com)
  15. ^ AI propaganda (www.govtech.com)
  16. ^ generate (www.technologyreview.com)
  17. ^ hyper-realistic (www.cambridge.org)
  18. ^ precursors (www.cambridge.org)
  19. ^ AI psy-ops (www.apa.org)
  20. ^ oversight framework (www.cambridge.org)
  21. ^ content moderation (cssh.northeastern.edu)
  22. ^ undermine (il.boell.org)
  23. ^ outstrip (www.mckinsey.com)
  24. ^ establish external oversight (ctb.ku.edu)
  25. ^ Facebook's algorithms fueled massive foreign propaganda campaigns during the 2020 election – here's how algorithms can manipulate you (theconversation.com)

Read more https://theconversation.com/an-ai-driven-influence-operation-is-spreading-pro-china-propaganda-across-youtube-219962
