US social media giant Meta has sued the Hong Kong-based company behind an app that uses artificial intelligence to create fake nude images of people without their consent, seeking to stop the firm from advertising the app on Meta platforms.
Meta said last Thursday that it had filed the complaint against Joy Timeline HK Limited, the maker of the CrushAI app, amid what it called a “concerning growth of so-called ‘nudify’ apps online.”
Joy Timeline HK Limited allegedly violated Meta’s rules by running CrushAI ads through at least 170 business accounts it created on Meta-owned Facebook and Instagram, CNN reported, citing the complaint filed with the District Court.
Some of the ads included AI-generated nude or sexually explicit images with captions such as “upload a photo to strip for a minute” and “erase any clothes on girls,” according to CNN.
Now, the tech giant is seeking an injunction to restrain the Hong Kong firm “from creating, sharing, publishing, disseminating or contributing to the publication” of such advertisements on its platforms, media outlets reported.

The order would target any content relating to apps designed to generate AI or deepfake images containing nudity or NCII [non-consensual intimate imagery] elements.
The tech giant is also seeking to recover the US$289,200 (HK$2.28 million) it said it had spent taking down, monitoring, and investigating the ads that the Hong Kong company had allegedly bought since September 2023, The Witness reported.
Meta said in a statement last week that its lawsuit “follows multiple attempts by Joy Timeline HK Limited to circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” it said.
Per its policies, Meta removes ads and Facebook and Instagram pages promoting so-called “nudify” apps, blocks links on its platforms, and restricts related search terms so that AI-generated non-consensual sexually explicit images are not circulated, it added in its statement.
Meta said its lawsuit is part of broader action against these apps.
“We’re building new technology to detect ads for nudify apps and sharing signals about these apps with other tech companies so they can take action too,” it said.