Google Play Lists AI “Nudify” Apps Available to All Users

“Nudify” apps, which use artificial intelligence to create nonconsensual nude or sexually explicit images, are being offered broadly through the Google Play Store, according to recent reporting, escalating scrutiny of how major app marketplaces handle this category of software.
The reports say these apps are available to general users on Google’s Android app marketplace, despite platform policies that are intended to limit sexual content and exploitation. Separate coverage also describes similar issues involving Apple’s App Store, with users being directed to “nudify” apps through app discovery features.
Android Police reported that AI “nudify” apps are being offered to all users on the Google Play Store. Engadget and the Tech Transparency Project separately reported that both Apple and Google were pointing users to “nudify” apps, raising concerns about how store search, suggestions, and recommendations can surface apps that generate sexually explicit images.
Other outlets, including NDTV Profit, The News International, digit.in, and 9to5Mac, echoed the same core allegation: that major mobile app stores have facilitated access to “nudify” tools even though both companies have rules that purport to restrict this type of content and behavior.
The development matters because “nudify” apps are closely tied to image-based sexual abuse and the creation of fake sexual imagery without consent. When distributed through the dominant mobile app stores, these tools become easier for users to find and download, increasing the risk of harm to targets, including minors and other vulnerable people.
It also puts pressure on Apple and Google to demonstrate that their enforcement mechanisms work at scale. App-store policies are often presented as a primary line of defense against exploitative or abusive apps, and the effectiveness of those policies is directly tied to app review, enforcement, and the behavior of discovery systems that recommend or surface apps to users.
The reporting further highlights a broader challenge for platforms: even if certain apps are removed, new listings can appear, and app descriptions may not explicitly advertise prohibited behavior. That dynamic can make enforcement difficult, but it does not remove the responsibility of app-store operators to prevent distribution of software designed for abuse.
What happens next will likely depend on how Apple and Google respond to the allegations described in the reports, including whether they remove identified apps, adjust search and recommendation systems, or take other steps to limit app discovery. Continued scrutiny from watchdog groups and press outlets is also expected as questions persist about how these apps are able to reach mainstream distribution channels.
For users, the reports serve as a warning that app-store availability is not, by itself, proof that an app is safe, appropriate, or compliant with platform rules, especially as generative AI tools become more common and more capable. The larger test now is whether the biggest mobile marketplaces can prevent nonconsensual sexual imagery tools from reaching broad distribution in the first place.
