Artificial intelligence and 'nudify' apps are making it easier for offenders to create sexualised images of children. The impact on young people can be devastating.
Warning: This story discusses child sexual abuse and online sexual exploitation.
Last week the eSafety Commissioner announced that the provider of three major 'nudify' apps had withdrawn its services from Australia following enforcement action.
This move comes after years of these tools spreading through schools and social platforms, turning everyday photos into sexualised content within seconds.
Experts say such apps are fuelling a rise in sextortion, deepfake abuse and online grooming. But shame and fear often keep many of those who fall victim to these predators silent.
Newsworthy's Raahat Shaik sat down with Dr Sarah Napier from the Australian Institute of Criminology, Dr Pooja Sawriker and Mikaela Jago from ICMEC Australia to discuss how AI is transforming the child protection landscape, why cultural silence can leave some children feeling more alone, and what needs to change to keep young people safe.
"They're taking normal photos of real children and making them sexual," says Dr. Sarah Napier from the Australian Institute of Criminology
If you or someone you know needs support, you can contact:
- 1800 Respect National Helpline on 1800 737 732
- Lifeline (24-hour crisis line) on 13 11 14
- Kids Helpline on 1800 55 1800
- Bravehearts information & support line for child sexual abuse on 1800 272 831
Raahat is a fourth-year Law/Media (Communications and Journalism) student at UNSW. She's passionate about justice-driven stories that empower, uplift and hold those in power to account.

