Apple and Google Promote Deepfake Nudify Apps in App Stores

Severity: Medium (Score: 57.8)

Sources: 9to5mac.com, www.techtransparencyproject.org, Appleinsider

Summary

A Tech Transparency Project investigation found that Apple and Google inadvertently promote "nudify" apps, which create deepfake nude images of women, through search results and advertisements in their app stores. Approximately 40% of the top apps returned for nudification-related searches can render women nude or scantily clad. According to the investigation, these apps have been downloaded 483 million times and generated over $122 million in revenue, and 31 of them are rated suitable for minors. Apple and Google have faced criticism for app review processes that allowed these apps to remain available despite their harmful capabilities. Both companies removed some apps after the report was published but have not fully addressed the underlying issues. The report highlights significant gaps in safeguards against nonconsensual deepfake content. Google stated that it is actively investigating the reported violations, while Apple has been urged to strengthen its App Review process.

Key Points:

  • Apple and Google app stores promote nudify apps that create deepfake images.
  • 40% of top search results for nudification yield apps capable of rendering nude images.
  • 31 nudify apps are rated suitable for minors, raising serious safety concerns.

Key Entities

  • Indonesia (country)
  • Android (platform)
  • Apple App Store (platform)
  • Google Play Store (platform)
  • iPhone (platform)
  • AI Replace & Remove — Fill App (tool)
  • Best Body AI — Fashion Editor (tool)
  • DreamFace: AI Video Generator (tool)
  • FaceTool: Face Swap & Generate (tool)
  • Grok (tool)