Apple Pulls AI Image Generators That Advertised Nonconsensual Nudes

Key Takeaways

  • Apple has removed multiple AI image generator apps that were marketed for making nonconsensual nudes.
  • The move came in response to an investigation that exposed the seemingly innocent apps' true marketing.
  • This comes ahead of an expected generative AI push at WWDC 2024.

Apple has pulled three AI image generation apps from the App Store after an investigation found that their developers were advertising the ability to create nonconsensual nude pictures.

The original 404 Media report, drawing on Meta’s Ad Library, found that the apps’ developers were advertising this capability on other platforms. Apple removed the apps after journalist Emanuel Maiberg shared the App Store links and associated ads.

Meta deleted the ads when alerted to their presence. Two additional ads promoted strictly web-based AI generators, which fall outside the App Store.

The takedown signals a more aggressive response from Apple, as well as Google, to “deepfake” apps (which often superimpose a person’s face on another body) and other nonconsensual AI image creators. When apps like these were discovered in 2022, Apple and Google would only require that developers pull the ads, not the apps.

Deepfakes have been around for several years. However, they and other AI image tools have become growing problems as the technology becomes more accessible and sophisticated. These systems are frequently used to create nonconsensual pornography, and students have been arrested for using “undress” apps that can make anyone appear naked.

There are strong incentives for Apple to crack down on misuse of AI image generators. The iPhone maker is expected to launch a major AI push at WWDC 2024, with generative features and other technologies coming to multiple operating systems. The company is under pressure to show that it supports ethical uses of AI, particularly as makers of image generators have been accused of allowing offensive images or training on copyrighted material.

Apple recently struck a deal with Shutterstock to use the stock photo supplier’s library for AI training. The arrangement ensured that image creators had permitted the use of their work, and that the source material would be safe to use.