Apple, Google offer ‘nudify’ apps despite policies against them


From its app store searches, the group identified 18 apps with nudifying capabilities in the Apple App Store and 20 in the Google Play Store. — Photo by James Yarema on Unsplash

Apple Inc and Google have continued to offer mobile apps that let users make nonconsensual sexualised images of people despite their policies prohibiting such content, according to a report published Wednesday by the Tech Transparency Project.

Searching for terms like “nudify” and “undress” in the Apple and Google app download stores gives customers access to software that can be used to alter images of celebrities and others to make them appear nude or in a state of partial undress, according to the group, a research arm of the nonprofit Campaign for Accountability. The companies also run ads for similar nudifying apps in their search results.

Apps identified by the group have been downloaded 483 million times and generated US$122mil (RM482mil) in revenue, according to the report, which cited revenue estimates from market researcher AppMagic. A spokesperson for AppMagic said the Tech Transparency Project’s work has resulted in several apps being removed and prompted others to change their user policies.

Over the past year, politicians around the world have ratcheted up calls to curb the spread of nudifying apps. Earlier this year, the companies removed apps flagged by the Tech Transparency Project. But just a few months later, dozens of similar ones could be found, researchers from the organisation said.

“It’s not just that the companies are failing to actually appropriately review these apps and continue to approve them and profit from them,” Katie Paul, director of the project, said in an interview. “They are actually directing users to the apps themselves.”

From its app store searches, the group identified 18 apps with nudifying capabilities in the Apple App Store and 20 in the Google Play Store. In addition, both Apple and Google sometimes directed users to the apps via their autocomplete features, suggesting the names of additional nudifying apps as users typed, the researchers said.

Some of the apps used names and images that cast them in a sexual light. Others could easily be used for that purpose despite not being marketed as such, making them more convenient than traditional photo-editing software. Some offered subscriptions, the Tech Transparency Project said.

Apple’s App Store guidelines for developers ban “overtly sexual or pornographic material.” The Google Play Store bans “apps that degrade or objectify people, such as apps that claim to undress people or see through clothing, even if labelled as prank or entertainment apps.”

Google said that many of the apps referenced in the report have been suspended from Google Play for violations of its policies, and an investigation is ongoing.

“When violations of our policies are reported to us, we investigate and take appropriate action,” the company said in an email.

Apple said it removed 15 apps identified by the group after Bloomberg reached out about their presence. Among the apps taken down was PicsVid AI Hot Video Generator, which offered templates that featured women sucking on phallic lollipops, according to the researchers. PicsVid’s developer didn’t respond to a request for comment.

Another app identified by the Tech Transparency Project, Uncensored AI – No Filter Chat, stripped clothes from an image of a woman uploaded by the researchers. A representative of Uncensored AI’s developer said the app no longer allows removal of clothes.

Apple said it contacted the developers of six apps to alert them to issues that need to be addressed and that they are at risk of being removed. Other apps mentioned by the Tech Transparency Project didn’t violate the company’s guidelines, Apple said. The company added that it has proactively rejected many apps and removed others. 

The tech giants’ enforcement efforts are "uneven and largely opaque,” according to Anne Helmond, a professor at Utrecht University in the Netherlands.

“If an app presents itself as a generic image generator, it may pass review, even if it can be misused in practice,” said Helmond, who is a director of the App Studies Initiative, an international research group. “Visibility is shaped by ranking and search systems that reward engagement, meaning that controversial uses can increase an app’s prominence.”

One of the apps identified by the researchers in the Google Play Store, Video Face Swap AI: DeepFace, advertised swapping actress Anya Taylor-Joy’s face onto Game of Thrones character Daenerys Targaryen. But inside the app, under a category called Girls, users could paste people’s faces onto video templates of women bouncing their breasts or shaking their hips, Bloomberg found. The app, which is rated “E” for Everyone, has been downloaded over 1 million times from the store, where users could find it by typing “face swap” into the search bar.

Okapi Software, the company that offers Video Face Swap AI, said it had launched an investigation into the issues raised by Bloomberg and removed some of the content, which it said had been uploaded by users. 

“Our app does not offer ‘nudify’ functionality, and we do not permit the generation of nude or sexually explicit content,” Okapi said. “We take content safety and compliance seriously.”

A growing chorus of regulators is calling for the companies to do more to uphold their policies. Last year, President Donald Trump signed the Take It Down Act, which criminalises the publication of non-consensual sexual content and compels social media platforms and websites to remove such posts. The UK government plans to introduce legislation in April that would open a path to prosecute tech executives whose companies do not take down such images. – Bloomberg

