The Times Australia
The Times World News

What to do if you, or someone you know, is targeted with deepfake porn or AI nudes

  • Written by Nicola Henry, Professor & Australian Research Council Future Fellow, Social and Global Studies Centre, RMIT University

This week, about 50 female students from Victoria’s Bacchus Marsh Grammar School had fake, sexually explicit images of them shared without their consent[1] on Instagram and Snapchat. Images of their faces, purportedly obtained from social media, were stitched onto pornographic images using artificial intelligence (AI).

Deepfake porn, or what our team calls[2] “AI-generated image-based sexual abuse”, involves the use of AI to create a nude and/or sexual image of a person doing or saying things they haven’t said or done.

Celebrities and public figures[3], predominantly women, have experienced such abuse for nearly a decade, with various deepfake porn sites and “nudify apps” readily available online.

But as these technologies become more accessible and sophisticated, we’re starting to see this problem creep into our homes and schools. Teens – and even children – are now being targeted.

How widespread is deepfake abuse?

In 2023, my colleagues and I surveyed[4] more than 16,000 adults in ten countries and found that, despite widespread media coverage (particularly in Western countries), the concept of deepfake porn isn’t well known. When informed about it, however, most respondents indicated it should be criminalised.

Among respondents from Australia, 3.7% had been a victim of deepfake porn as an adult. This was the highest rate among the countries we surveyed.

At the same time, 2.4% of Australian respondents said they had created, shared or threatened to share a deepfake photo or video of another person without their consent. This figure, too, was higher than in every other country we surveyed except the United States.

Men were more likely to report being a victim of deepfake abuse, and more likely to report being a perpetrator. Men were also less likely to find the viewing, creating and/or sharing of deepfake pornography to be problematic.

What can you do if you’re targeted?

Image-based abuse can be a deeply distressing experience. But victims should know they're not alone, that it isn't their fault, and that help is available. Here are some steps they can take.

1. Report it

Creating or sharing deepfake sexual images of minors is a criminal offence under Australia’s federal child sexual abuse material[5] (“child pornography”) laws. It’s also a criminal offence to share non-consensual deepfake porn of an adult (and a crime to create it if you’re in Victoria).

Whether you’re the victim, or someone you know is, you can report deepfake abuse to digital platforms[6], to the Australian Centre to Counter Child Exploitation[7] (if the person depicted is a minor) and to the eSafety Commissioner[8].

Creating or sharing deepfake sexual images of minors is a criminal offence in Australia. LBeddoe/Shutterstock[9]

If you’re in danger, contact the police or ambulance on triple zero (000). If it’s not an emergency, you can call the Police Assistance Line[10] (131 444) or your local police station. The same steps apply if you’re a bystander who has come across non-consensual deepfake pornography of someone else online.

The eSafety Commissioner can take action against image-based abuse under the federal Online Safety Act[11], and can work with victims and their supporters to get the content taken down within 24 hours. The commissioner can also issue formal warnings and take-down notices, and seek civil penalties against individuals and technology companies that fail to act.

Unfortunately, the deepfake content may continue to circulate even after it is taken down from the initial platform.

2. Seek help

If you’ve been targeted, it’s a good idea to talk to someone you trust, such as a friend, family member, teacher, counsellor or psychologist.

Our website has a list of relevant support services[12] for victim-survivors of image-based abuse, including specialist services for Aboriginal and Torres Strait Islander people, migrants and refugees, young people, people with disabilities, people from LGBTQI+ communities and sex workers.

Even if you’re not ready to talk about the experience, you can still find useful information about image-based abuse online, including on the eSafety commissioner’s website[13].

It’s a good idea to talk to someone you trust. fizkes/Shutterstock[14]

We’ve also developed a chatbot called Umibot[15], which provides free confidential advice and support to people who have experienced image-based abuse, including deepfake abuse. Umibot also has information for bystanders and perpetrators.

If you’re Aboriginal or Torres Strait Islander, you can check out WellMob[16]. This is an online resource made by Indigenous Australians to provide information on social and emotional wellbeing.

Resources for young people are also available from ReachOut[17], Beyond Blue[18], Youth Law Australia[19] and Kids Helpline[20].

3. Create a digital hash to stop the spread

The United Kingdom’s Revenge Porn Helpline and Meta have developed two digital hashing tools for victim-survivors. These are Stop NCII[21] for adults, and Take It Down[22] for minors.

Anyone in the world can use these tools to generate an anonymous digital hash (a unique numerical fingerprint) by scanning the image on their device. The hash is then shared with the companies participating[23] in the scheme (including Facebook, Instagram, Pornhub, TikTok and OnlyFans) so they can detect and block any matching images on their platforms. The image itself is never uploaded and never leaves your device, so no one else sees it.

It’s important to note this tool won’t block the image from appearing on platforms that aren’t part of the scheme. You also need to have access to the images in the first place to use the tool.
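The key privacy property of these tools is that only a fingerprint of the image, not the image itself, is shared for matching. As a simplified illustration of that idea, here is a hedged sketch in Python using an exact-match cryptographic hash. (This is not how Stop NCII or Take It Down actually work: those services use perceptual hashing, which can also match visually similar copies of an image, but the principle of hashing locally so the image never leaves the device is the same.)

```python
import hashlib

def local_image_hash(path: str) -> str:
    """Compute a hex fingerprint of an image file's bytes.

    The file is read locally in chunks; only the resulting hash string
    would ever be shared, never the image itself.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even large images are hashed without
        # loading the whole file into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

A platform holding only this fingerprint could check uploads against it without ever possessing the original image, which is the core of how hash-matching schemes protect victims' privacy.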

4. Block, report and distance yourself from the perpetrator (if it’s safe to do so)

You can block the perpetrator(s) through your mobile and on social media, and report them to the relevant platforms and authorities. In the case of platforms, it’s not always clear what will be done once a report is lodged, so it’s a good idea to ask about this.

If the perpetrator is someone you know, such as a classmate or student, authorities can take action to ensure you don’t interact with that person anymore.

Last week, a boy was expelled from Melbourne’s Salesian College[24] after he used AI to create sexually explicit images of a female teacher.

5. Boost your online safety

The eSafety commissioner has step-by-step video guides[25] on a range of online safety topics, from how to change your privacy settings on social media, to how to choose strong passwords.

For women experiencing family or domestic violence, specialist family violence support services may also be helpful.

References

  1. ^ shared without their consent (www.abc.net.au)
  2. ^ our team calls (arxiv.org)
  3. ^ Celebrities and public figures (theconversation.com)
  4. ^ surveyed (arxiv.org)
  5. ^ federal child sexual abuse material (www.criminalsolicitorsmelbourne.com.au)
  6. ^ digital platforms (www.imagebasedabuse.com)
  7. ^ Australian Centre to Counter Child Exploitation (www.accce.gov.au)
  8. ^ eSafety Commissioner (www.esafety.gov.au)
  9. ^ LBeddoe/Shutterstock (www.shutterstock.com)
  10. ^ Police Assistance Line (www.police.vic.gov.au)
  11. ^ Online Safety Act (www.esafety.gov.au)
  12. ^ relevant support services (www.imagebasedabuse.com)
  13. ^ eSafety commissioner’s website (www.esafety.gov.au)
  14. ^ fizkes/Shutterstock (www.shutterstock.com)
  15. ^ Umibot (umi.rmit.edu.au)
  16. ^ WellMob (wellmob.org.au)
  17. ^ ReachOut (au.reachout.com)
  18. ^ Beyond Blue (www.beyondblue.org.au)
  19. ^ Youth Law Australia (yla.org.au)
  20. ^ Kids Helpline (kidshelpline.com.au)
  21. ^ Stop NCII (stopncii.org)
  22. ^ Take It Down (tidstart.ncmec.org)
  23. ^ companies participating (stopncii.org)
  24. ^ Melbourne’s Salesian College (www.dailymail.co.uk)
  25. ^ video guides (www.esafety.gov.au)

Read more https://theconversation.com/what-to-do-if-you-or-someone-you-know-is-targeted-with-deepfake-porn-or-ai-nudes-232175
