Edgar Cervantes / Android Authority
TL;DR
- A new report discovered dozens of AI "nudify" and face-swap apps on the Apple App Store and Google Play Store.
- Despite policies against sexual nudity, these apps were distributed through the app stores. Many of the apps generated significant revenue from in-app purchases, of which the platforms take as much as a 30% cut.
- Many of these apps were removed following the report, but several are still available for download.
AI is one of the most powerful tools available today, and you can do a lot with it. But, like any tool, it can be used for good or bad. Over the past few weeks, we've witnessed social media users using X's Grok AI chatbot in lewd ways, primarily to create sexualized imagery of women without their consent. As it turns out, it's not just Grok that has been at the center of this undressing scandal, as both the Apple App Store and the Google Play Store allegedly hosted "nudify" apps.
The Tech Transparency Project found 55 apps in the Google Play Store that allowed for creating nude images of women, while the Apple App Store hosted 47 such apps, with 38 being common between the two stores. These apps were available as of last week, though the report mentions that Google and Apple subsequently removed 31 and 25 apps, respectively, after the list of these nudify apps was shared with them.
For its investigation, the Tech Transparency Project searched for terms like "nudify" and "undress" across the two app stores and found dozens of results. Many of these apps used AI either to generate videos or images from user prompts, or to superimpose the face of one person onto another's body, i.e., "face swapping."
Alarmingly, apps like DreamFace (an AI image/video generator that is still available on the Google Play Store but has been removed from the Apple App Store) presented no resistance when users entered lewd prompts to show naked women. The app allows users to create one video a day for free using prompts, after which they must subscribe for paid features. The report cites AppMagic statistics to say that the app has generated $1 million in revenue.
Keep in mind that Google and Apple both charge a significant share (up to 30%) on in-app purchases, such as subscriptions, effectively profiting from such harmful apps.
Similarly, Collart is another AI image/video generator that's still available on the Google Play Store but has been removed from the Apple App Store. This app is said to not only accept prompts to nudify women, but also to accept prompts to depict them in pornographic situations with no apparent restrictions. These are just two examples, but the report mentions several more with damning evidence.
Face swap apps are even more harmful and predatory, as they superimpose faces that the user potentially knows onto naked bodies. Apps like RemakeFace are still available on both the Google Play Store and the Apple App Store at the time of writing, and the report confirms that they could easily be used to create non-consensual nudes of women.
Both the Google Play Store and the Apple App Store prohibit apps that depict sexual nudity. But as the report revealed, these apps were distributed through the app stores (with in-app subscriptions) despite their apparent violation of those stores' policies. It's clear that app stores haven't kept up with the spread of AI deepfake apps that can "nudify" people without their permission. The platforms have a clear responsibility to protect users, and we hope they tighten their guidelines and focus more on proactive monitoring of such apps rather than reacting to reports.
We reached out to Google and Apple for comment on this matter. We'll update this article when we hear back from the companies.