'Nudify' apps have been banned. Here's why they have such an impact on kids
Artificial intelligence and 'nudify' apps are making it easier for offenders to create sexualised images of children. The impact for young people can be devastating.

Warning: This story discusses child sexual abuse and online sexual exploitation.

Last week the eSafety Commissioner announced that the provider of three major 'nudify' apps has withdrawn its services from Australia following enforcement action.

This move comes after years of these tools spreading through schools and social platforms, turning everyday photos into sexualised content within seconds.

Experts say such apps are fuelling a rise in sextortion, deepfake abuse, and online grooming. But shame and fear often keep victims of these predators silent.

Newsworthy's Raahat Shaik sat down with Dr Sarah Napier from the Australian Institute of Criminology, Dr Pooja Sawriker and Mikaela Jago from ICMEC Australia to discuss how AI is transforming the child protection landscape, why cultural silence can leave some children feeling more alone, and what needs to change to keep young people safe.


"They're taking normal photos of real children and making them sexual," says Dr. Sarah Napier from the Australian Institute of Criminology

If you or someone you know needs support, you can contact:


©2025 UNSW Sydney All Rights Reserved.