
It was always going to be trouble: a ‘nudify’ app that converts ordinary images into fake nude imagery.
Now eSafety has compelled the provider of three of the world’s most widely used ‘nudify’ apps to withdraw its services from Australia after school children were targeted with the technology.
The provider marketed features like undressing ‘any girl,’ with options for ‘schoolgirl’ image generation and features such as ‘sex mode’.
eSafety issued the UK-based company with an official warning in September for allowing its services to be used to create artificially generated child sexual exploitation material.
The ‘nudify’ services provided by the company were receiving approximately 100,000 visits a month from Australians and have featured in high-profile cases involving the creation of AI-generated sexual exploitation material of students in Australian schools.
It comes as global AI model hosting platform Hugging Face has also taken key steps to comply with Australian law after warnings from eSafety that certain generative AI models it hosts are being misused by Australians to create AI-generated child sexual exploitation material.
Hosting platforms like Hugging Face act as ‘gatekeepers’ for the distribution of these powerful AI models, much as traditional app stores do, so it is equally important to ensure they also have measures in place to protect children.
eSafety Commissioner Julie Inman Grant said, “We know ‘nudify’ services have been used to devastating effect in Australian schools and with this major provider blocking their use by Australians we believe it will have a tangible impact on the number of Australian school children falling victim to AI-generated child sexual exploitation.
“There have been instances where models were downloaded from AI model hosting platforms like Hugging Face by Australian users and used to create child sexual exploitation material, including depictions of real children and survivors of sexual abuse. We’ve also seen them host these so-called ‘nudify’ models which we’ve seen used with devastating impacts on Australian school students.”
Following engagement from eSafety about compliance concerns, Hugging Face has now changed its terms of service so that all account holders are required to take steps to minimise the risks associated with the models they upload, specifically to prevent misuse to generate child sexual exploitation material or pro-terror material.
From now on, if the company becomes aware that its new terms have not been complied with, whether through user reports or its own safety efforts, it is required to enforce them.
If Hugging Face fails to take appropriate action in response to a breach of its terms, eSafety could take enforcement action. eSafety has a range of enforcement mechanisms under the Online Safety Act for any company that fails to comply with an industry code or standard, including seeking penalties of up to $49.5 million.