
NSW introduces new deepfake laws

Technology

This week an Australian man was fined more than $340,000 for sharing deepfake images of well-known women. The ruling comes after NSW introduced new laws against image-based abuse.

Trigger warning: This story mentions sexual and image-based abuse

In a landmark Australian case, a man has been fined $343,500 for sharing deepfake pornographic images of well-known women. Anthony Rotondo uploaded these images onto the now defunct site MrDeepFakes.com and, despite court orders, continued to email the images to regulators and media.

The eSafety Commissioner brought the case against him almost two years ago. When the ruling was announced, Commissioner Julie Inman Grant said it sent a clear, powerful message about the serious harm caused by image-based abuse.

The ruling came a few weeks after the New South Wales government made creating and sharing AI-generated sexually explicit deepfakes illegal. Those who share the images face up to three years in prison, fines up to $11,000, or both.

The state government also made it a crime to use AI to create, record or distribute sexually explicit text, or sexually explicit audio designed to sound like an identifiable person.

NSW premier Chris Minns said the new laws sent a strong message that those who target women using AI will face consequences. Creative Commons image

When the reforms were announced, NSW premier Chris Minns said: "This legislation sends a clear message: those who seek to target women using this technology now face serious consequences."

What are deepfakes and how are they generated?

According to the Australian Human Rights Commission, deepfake sexual material or deepfake pornography covers all intimate or sexual images and videos created with AI.

Deepfake sexually explicit text covers fabricated messages describing sexual acts, created or altered by AI to make them look like they were written by someone else.

This type of content has risen rapidly because open-source AI image generation is cheap and easy to access. In 2022, the global market for AI image generators was valued at about US$257 million; it's projected to be worth US$917 million by 2030.

Deepfake videos first appeared in 2017, when a Reddit user began using a machine learning algorithm to swap female celebrities' faces onto the bodies of pornographic actresses.

In 2019, 'nudify' apps were launched, making it easy for users to upload photos of real women that the software quickly 'stripped' to produce fake nude images.

This type of content is becoming increasingly popular.

A study by Graphika, an American social network analysis firm, revealed that 34 providers of non-consensual intimate images attracted over 24 million visitors to their websites. Referral spam marketing for these services has skyrocketed by more than 2,000% on platforms like Reddit and X since early 2023.

The study also reported that one million people participate in 52 Telegram groups focused on non-consensual intimate image services.

What’s the link between deepfakes and gendered online abuse?

There is a clear link between deepfakes and gendered online abuse.

"Non-consensual, sexually explicit deepfakes are a form of abuse disproportionately targeted at women and young girls," NSW Attorney General Micheal Daley said in a statement.

US researcher Emily Chapman found that perpetrators assert their control over women’s identities and bodies by modifying images and videos. They use technology to digitally objectify women.

Survivors of this type of digital abuse often experience trauma, anxiety, depression and a profound loss of trust in others.

Lawyer and activist Noelle Martin is a survivor of deepfake abuse.

"Deepfake abuse affects a person over the course of their life, potentially has impacts on their employability, on their mental health, physical health, on their interpersonal relationships. Every aspect of their life is affected," she says.

Noelle Martin is an advocate for survivors of deepfake abuse. Supplied

This type of content often remains online indefinitely, causing lasting reputational damage, while threatening careers and relationships.

Deepfakes also silence women, discouraging them from engaging in digital spaces, reinforcing gender stereotypes and undermining the trustworthiness of online conversations.

What are other states and the federal government doing?

With these new laws, NSW joins South Australia and Victoria, where it's already a crime to create and share deepfakes.

In 2024, the federal government passed an amendment to the Criminal Code to address deepfake sexual material abuse.

Under the amendment, it's now a crime to share sexual content via any communication service if those depicted in the images don't consent, or if the sharer is reckless about consent.

It's a crime in Australia to distribute deepfake sexual material on any communication platform. Shutterstock

Will the law be effective?

While the reforms are a step forward, experts warn that their impact might be limited.

"These laws are a step in the right direction," says Martin. "[However] these reforms will mainly serve an educational and deterrent purpose, sending a clear message that this behaviour is criminal and unacceptable."

"Since deepfake abuse is a worldwide problem without borders, there is only so much that NSW can do to address it. Even with these reforms, women in NSW can still be targeted and have little to no meaningful recourse."

The challenge lies in enforcement: how police prioritise these offences, how courts handle cases and what penalties are handed down.

Policing the tech ecosystem is another challenge. "The bigger problem governments will need to grapple with are all the corporate actors in the deepfake abuse pipeline who enable, facilitate, profit from or direct traffic to this abuse," Martin says.

"Without going after the supply chain, the pipeline, this abuse will continue to occur at scale."

Martin says that even with the laws in place, holding perpetrators accountable cannot undo the pain victim-survivors have endured. The effects of deepfake abuse can last a lifetime, leaving survivors to manage ongoing harm that laws can't fully address.

While legal action can provide some recognition, it often does not go far enough in helping victims rebuild their lives or find true justice.

What support is available for survivors?

There are several organisations that offer support, advocacy and crisis assistance to those affected by deepfake abuse or image-based exploitation.

Survivors can access online abuse support tools, such as eSafety services that generate a digital fingerprint of an image to prevent re-uploads, or StopNCII.org, which helps adults remove their images from online platforms.

The Survivor Hub, a Sydney-based, survivor-led organisation, offers peer-led in-person support groups and a private Facebook group for those affected by sexual assault to connect, share experiences and access support. Call 1800 385 578

Dolly’s Dream provides a support line and resources aimed at shielding young people from bullying, harassment and abuse in online environments. Call 0488 881 033

Lifeline offers free, confidential crisis support 24/7 for anyone facing distress, anxiety or thoughts of self-harm. Call 13 11 14

