The Times Australia

Taylor Swift deepfakes: new technologies have long been weaponised against women. The solution involves us all

  • Written by Nicola Henry, Professor & Australian Research Council Future Fellow, Social and Global Studies Centre, RMIT University

Sexually graphic “deepfake” images of Taylor Swift[1] went viral on social media last week, fuelling widespread condemnation from Swifties, the general public and even the White House.

This problem isn’t new. Swift is one of many celebrities and public figures, mainly women[2], who have fallen victim to deepfake pornography in recent years. High-profile examples garner significant media attention, but the increasingly sophisticated nature of AI means anyone can now be targeted[3].

While there are grave concerns about the broader implications of deepfakes, it’s important to remember the technology isn’t the cause of abuse. It’s just another tool used to enact it[4].

Deepfakes and other digitally manipulated media

The sexually explicit deepfakes of Swift appeared on multiple social media platforms last week, including X (formerly Twitter), Instagram, Facebook and Reddit.

Most major platforms have bans on sharing synthetic and digitally manipulated media that cause harm, confusion or deception, including deepfake porn. This includes images created through simpler means such as photo-editing software. Nonetheless, one deepfake depicting Swift was viewed[5] 47 million times over a 17-hour period before it was removed from X.

There’s a long history[6] of digital technologies, apps and services being used to facilitate gender-based violence, including sexual harassment, sexual assault, domestic or family violence, dating abuse, stalking and monitoring, and hate speech.

As such, our focus should also be on addressing the problematic gender norms and beliefs that often underpin these types of abuse.

The emergence of deepfakes

The origins of deepfakes can be traced to November 2017 when a Reddit user called “deepfakes”[7] created a forum and video-editing software that allowed users to train their computers to swap the faces of porn actors with the faces of celebrities.

Since then, there’s been a massive expansion of dedicated deepfake websites and threads, as well as apps that can create customised deepfakes for free or for a fee.

In the past, creating a convincing deepfake often required extensive time and expertise, a powerful computer and access to multiple images of the person being targeted. Today, almost anyone can make a deepfake – sometimes in a matter of seconds.

Read more: Taylor Swift deepfakes: a legal case from the singer could help other victims of AI pornography[8]

The harms of deepfake porn

Not all applications of AI-generated imagery are harmful. You might have seen funny viral deepfakes such as the images of Pope Francis in a puffer jacket[9]. Or if you watch the latest Indiana Jones film, you’ll see Harrison Ford “de-aged” by 40 years[10] thanks to AI.

That said, deepfakes are often created for malicious purposes, including disinformation, cyberbullying, child sexual abuse, sexual extortion and other forms of image-based sexual abuse[11].

A report published[12] by startup Home Security Heroes estimated there were 95,820 deepfake videos online in 2023, a 550% increase since 2019.

When it comes to deepfake porn, women in particular are disproportionately targeted. According to DeepTrace[13], 96% of all deepfakes online are non-consensual fake videos of women[14]. These are mostly (but not exclusively) well-known actors and musicians.

Swift’s Eras Tour will be coming to Australia this month. Natacha Pisarenko/AP

This is concerning but not surprising. Research[15] shows online abuse disproportionately affects women and girls, particularly Indigenous women, women from migrant backgrounds and lesbian, gay, bisexual, transgender and intersex people.

Public figures in particular face higher rates of online abuse[16], especially women and gender-diverse people. One study found celebrities are attributed more blame[17] than non-celebrities for the abuse they receive online, and this abuse is often viewed as less serious.

Research shows image-based abuse can result in significant harms[18] for victims, including anxiety, depression, suicidal ideation, social isolation and reputational damage. For public figures[19], deepfakes and other forms of online abuse can similarly result in diminished career prospects, withdrawal from public life and negative mental health outcomes.

In 2016, Australian activist and law reform campaigner Noelle Martin’s[20] photos were taken from social media and superimposed onto pornographic images. Martin reported feeling “physically sick, disgusted, angry, degraded, dehumanised” as a result. Digitally altered and deepfake images of Martin continue to circulate online without her consent.

Australian Noelle Martin found deepfake porn depicting her when she googled herself. Andres Kudacki/AP

Responding to deepfake porn

Anyone can be targeted through deepfakes. All that’s needed is an image of the target’s face; even professional headshots can be used.

Although law reform alone won’t solve this socio-legal problem, it can signal the issue is being taken seriously. We need laws specifically targeting non-consensual deepfake porn[21].

In Australia, image-based sexual abuse offences[22] exist in every state and territory except Tasmania, as well as at the federal level. However, only some of these laws specifically mention digitally altered images (including deepfakes).

Technology companies could also do a lot more to proactively detect and moderate deepfake porn. They need to prioritise embedding “safety by design[23]” approaches into their services from the outset. This could mean:

  • designing and testing AI with potential misuses in mind
  • using watermarks and other indicators to label content as synthetic
  • “nudging” users to refrain from certain behaviours (such as using pop-ups to remind them about the importance of consent).

Research shows there are gaps in public understanding of deepfakes and how to detect them. This further highlights a need for digital literacy[24] and education on the difference between consensual and non-consensual uses of intimate images, and the harms of non-consensual deepfake porn.

Finally, and perhaps most importantly, we need to address the underlying systemic inequalities that contribute to technology-facilitated abuse against women and gender-diverse people. This includes recognising deepfake porn for the often-gendered problem it is – for celebrities and non-celebrities alike.

Read more: Celebrity deepfakes are all over TikTok. Here's why they're becoming common – and how you can spot them[25]

If this article has raised issues for you, or if you’re concerned about someone you know, call 1800RESPECT on 1800 737 732, or visit the eSafety Commissioner’s website[26] for help with image-based abuse. If you’re in immediate danger, call 000.

References

  1. ^ “deepfake” images of Taylor Swift (www.theguardian.com)
  2. ^ mainly women (regmedia.co.uk)
  3. ^ anyone can now be targeted (www.abc.net.au)
  4. ^ used to enact it (theconversation.com)
  5. ^ was viewed (www.aljazeera.com)
  6. ^ long history (www.tandfonline.com)
  7. ^ when a Reddit user called “deepfakes” (www.vice.com)
  8. ^ Taylor Swift deepfakes: a legal case from the singer could help other victims of AI pornography (theconversation.com)
  9. ^ Pope Francis in a puffer jacket (www.washingtonpost.com)
  10. ^ Harrison Ford “de-aged” by 40 years (theconversation.com)
  11. ^ image-based sexual abuse (theconversation.com)
  12. ^ report published (www.homesecurityheroes.com)
  13. ^ According to DeepTrace (www.vox.com)
  14. ^ are non-consensual fake videos of women (regmedia.co.uk)
  15. ^ Research (plan-international.org)
  16. ^ higher rates of online abuse (journals.sagepub.com)
  17. ^ are attributed more blame (www.researchgate.net)
  18. ^ significant harms (journals.sagepub.com)
  19. ^ public figures (journals.sagepub.com)
  20. ^ Noelle Martin’s (www.watoday.com.au)
  21. ^ non-consensual deepfake porn (theconversation.com)
  22. ^ image-based sexual abuse offences (www.imagebasedabuse.com)
  23. ^ safety by design (www.esafety.gov.au)
  24. ^ a need for digital literacy (theconversation.com)
  25. ^ Celebrity deepfakes are all over TikTok. Here's why they're becoming common – and how you can spot them (theconversation.com)
  26. ^ eSafety Commissioner’s website (www.esafety.gov.au)

Read more https://theconversation.com/taylor-swift-deepfakes-new-technologies-have-long-been-weaponised-against-women-the-solution-involves-us-all-222268

Times Magazine

Building an AI-First Culture in Your Company

AI isn't just something to think about anymore - it's becoming part of how we live and work, whether we like it or not. At the office, it definitely helps us move faster. But here's the thing: just using tools like ChatGPT or plugging AI into your wo...

Data Management Isn't Just About Tech—Here’s Why It’s a Human Problem Too

Photo by Kevin Kuby Manuel O. Diaz Jr.We live in a world drowning in data. Every click, swipe, medical scan, and financial transaction generates information, so much that managing it all has become one of the biggest challenges of our digital age. Bu...

Headless CMS in Digital Twins and 3D Product Experiences

Image by freepik As the metaverse becomes more advanced and accessible, it's clear that multiple sectors will use digital twins and 3D product experiences to visualize, connect, and streamline efforts better. A digital twin is a virtual replica of ...

The Decline of Hyper-Casual: How Mid-Core Mobile Games Took Over in 2025

In recent years, the mobile gaming landscape has undergone a significant transformation, with mid-core mobile games emerging as the dominant force in app stores by 2025. This shift is underpinned by changing user habits and evolving monetization tr...

Understanding ITIL 4 and PRINCE2 Project Management Synergy

Key Highlights ITIL 4 focuses on IT service management, emphasising continual improvement and value creation through modern digital transformation approaches. PRINCE2 project management supports systematic planning and execution of projects wit...

What AI Adoption Means for the Future of Workplace Risk Management

Image by freepik As industrial operations become more complex and fast-paced, the risks faced by workers and employers alike continue to grow. Traditional safety models—reliant on manual oversight, reactive investigations, and standardised checklist...

The Times Features

What Is the Dreamtime? Understanding Aboriginal Creation Stories Through Art

Aboriginal culture is built on the deep and important meaning of Dreamtime, which links beliefs and history with the elements that make life. It’s not just myths; the Dreamtime i...

How Short-Term Lenders Offer Long-Lasting Benefits in Australia

In the world of personal and business finance, short-term lenders are often viewed as temporary fixes—quick solutions for urgent cash needs. However, in Australia, short-term len...

Why School Breaks Are the Perfect Time to Build Real Game Skills

School holidays provide uninterrupted time to focus on individual skill development Players often return sharper and more confident after structured break-time training Holid...

Why This Elegant Diamond Cut Is Becoming the First Choice for Modern Proposals

Personalised engagement styles are replacing one-size-fits-all traditions A rising diamond cut offers timeless elegance with a softer aesthetic Its flexible design wo...

Is sleeping a lot actually bad for your health? A sleep scientist explains

We’re constantly being reminded by news articles and social media posts that we should be getting more sleep. You probably don’t need to hear it again – not sleeping enough i...

Ricoh Launches IM C401F A4 Colour MFP to Boost Speed and Security in Hybrid Workplaces

Ricoh, a leading provider of smart workplace technology, today launched the RICOH IM C401F, an enterprise-grade A4 colour desktop multifunction printer (MFP) designed for Austral...