The Times Australia

Whether of politicians, pop stars or teenage girls, sexualised deepfakes are on the rise. They hold a mirror to our sexist world

  • Written by Anastasia Powell, Professor, Family & Sexual Violence, RMIT University

Victorian MP Georgie Purcell recently spoke out[1] against a digitally edited image in the news media that had altered her body and partially removed some of her clothing.

Whether or not the editing was assisted by artificial intelligence (AI), her experience demonstrates the sexist, discriminatory and gender-based harms that can occur when these technologies are used unchecked.

Purcell’s experience also reflects a disturbing trend in which images, particularly of women and girls, are being sexualised, “deepfaked” and “nudified” without the person’s knowledge or consent.

Read more: Nine was slammed for 'AI editing' a Victorian MP's dress. How can news media use AI responsibly?[2]

What’s AI got to do with it?

The term AI[3] can include a wide range of computer software and smartphone apps that use some level of automated processing.

While science fiction[4] might lead us to think otherwise, much of the everyday use of AI-assisted tools is relatively simple. We teach a computer program or smartphone application what we want it to do, it learns from the data we feed it, and it applies this learning to perform the task in varying ways.

A problem with AI image editing is that these tools rely on the information our human society has generated. It is no accident that instructing a tool to edit a photograph of a woman might result in it making the subject look younger, slimmer and/or curvier, and even less clothed[5]. A simple internet search for “women” will quickly reveal that these are the qualities our society frequently endorses.

While digitally retouching real photos has been happening for years, fake images of women are on the rise. Shutterstock[6]

Similar problems have emerged in AI facial recognition tools that have misidentified suspects in criminal investigations due to the racial and gender bias that is built into the software. The ghosts of sexism[7] and racism, it seems, are literally in the machines.

Technology reflects back to us the disrespect, inequality, discrimination and – in the treatment of Purcell – overt sexism that we ourselves have already circulated.

Sexualised deepfakes

While anyone can be a victim of AI-facilitated image-based abuse, or sexualised deepfakes, it is no secret that there are gender inequalities in pornographic imagery found online.

Sensity AI (formerly Deeptrace Labs) has reported[8] on online deepfake videos since December 2018 and consistently found that 90–95% of them are non-consensual pornography[9]. About 90% of them are of women.

Young women, children and teens across the globe are also being subjected to the non-consensual creation and sharing of sexualised and nudified deepfake imagery. Recent reports of faked sexualised images of teenage girls have emerged from a New Jersey high school in the United States[10] and another high school in Almendralejo, Spain[11]. A third instance was reported in a London high school[12], which contributed to a 14-year-old girl taking her own life.

Read more: Taylor Swift deepfakes: new technologies have long been weaponised against women. The solution involves us all[13]

Women celebrities are also a popular target of sexualised deepfake imagery. Just last month, sexualised deepfakes of Taylor Swift[14] were openly shared across a range of digital platforms and websites without consent.

While research data on broader victimisation and perpetration rates of this sort of image editing and distribution is sparse, our 2019 survey[15] across the United Kingdom, Australia and New Zealand found 14.1% of respondents aged between 16 and 84 had experienced someone creating, distributing or threatening to distribute a digitally altered image representing them in a sexualised way.

Our research also shed light on the harms of this form of abuse. Victims reported experiencing psychological, social, physical, economic and existential trauma, similar to the harms identified by victims of other forms of sexual violence and image-based abuse.

This year, we have begun a new study[16] to further explore the issue. We’ll look at current victimisation and perpetration rates, as well as the consequences and harms, of the non-consensual creation and sharing of sexualised deepfakes across the US, UK and Australia. We want to find out how we can improve responses, interventions and prevention.

How can we end AI-facilitated abuse?

The abuse of Swift in such a public forum has reignited a call for federal laws and platform regulations, moderation and community standards to prevent and block sexualised deepfakes from being shared.

Stunningly, while the non-consensual sharing of sexualised deepfakes is already a crime in most Australian states and territories, the laws relating to their creation[17] are less consistent. And in the US, there is no national law[18] criminalising sexualised deepfakes. Fewer than half of US states have one, and state laws are highly variable in how much they protect and support victims.

Sexualised deepfake images of Taylor Swift were circulated widely online. Jordan Strauss/AP

But focusing on individual states or countries is not sufficient to tackle this global problem. Sexualised deepfake abuse is perpetrated across national borders, highlighting the need for collective global action.

There is some hope we can learn to better detect AI-generated content through guidance in spotting fakes. But the reality is that technologies are constantly improving, so our abilities to differentiate the “real” from the digitally “faked” are increasingly limited.

Read more: Celebrity deepfakes are all over TikTok. Here's why they're becoming common – and how you can spot them[19]

The advances in technology are compounded by the increasing availability of “nudify” or “remove clothing” apps on various platforms and app stores, which are commonly advertised online. Such apps further normalise the sexist treatment and objectification of women, with no regard for how the victims themselves may feel about it.

But it would be a mistake to blame technology alone for the harms, or the sexism, disrespect and abuse that flows from it. Technology is morally neutral. It can take neither credit nor blame.

Instead, there is a clear onus on technology developers, digital platforms and websites to be more proactive by building in safety by design[20]. In other words, putting user safety and rights front and centre in the design and development of online products and services. Platforms, apps and websites also need to be made responsible for proactively preventing, disrupting and removing non-consensual content, and the tools that create it.

Australia is leading the way with this through the Office of the eSafety Commissioner, including national laws[21] that hold digital platforms to account.

But further global action and collaboration is needed if we truly want to address and prevent the harms of non-consensual sexualised deepfakes.

References

  1. ^ recently spoke out (www.abc.net.au)
  2. ^ Nine was slammed for 'AI editing' a Victorian MP's dress. How can news media use AI responsibly? (theconversation.com)
  3. ^ term AI (www.sas.com)
  4. ^ science fiction (www.denofgeek.com)
  5. ^ less clothed (www.washington.edu)
  6. ^ Shutterstock (www.shutterstock.com)
  7. ^ ghosts of sexism (www.heraldsun.com.au)
  8. ^ Sensity AI (formerly Deeptrace Labs) has reported (sensity.ai)
  9. ^ found that 90–95% of them are non-consensual pornography (www.vox.com)
  10. ^ New Jersey high school in the United States (apnews.com)
  11. ^ Almendralejo, Spain (theconversation.com)
  12. ^ London high school (metro.co.uk)
  13. ^ Taylor Swift deepfakes: new technologies have long been weaponised against women. The solution involves us all (theconversation.com)
  14. ^ sexualised deepfakes of Taylor Swift (www.vice.com)
  15. ^ our 2019 survey (academic.oup.com)
  16. ^ new study (research.monash.edu)
  17. ^ the laws relating to their creation (research.monash.edu)
  18. ^ no national law (www.msnbc.com)
  19. ^ Celebrity deepfakes are all over TikTok. Here's why they're becoming common – and how you can spot them (theconversation.com)
  20. ^ safety by design (www.esafety.gov.au)
  21. ^ national laws (www.esafety.gov.au)

Read more https://theconversation.com/whether-of-politicians-pop-stars-or-teenage-girls-sexualised-deepfakes-are-on-the-rise-they-hold-a-mirror-to-our-sexist-world-222491
