The San Francisco City Attorney’s office is suing 16 of the most frequently visited AI-powered “undressing” websites, often used to create nude deepfakes of women and girls without their consent. The landmark lawsuit, announced at a press conference by City Attorney David Chiu, says that the targeted websites were collectively visited more than 200 million times in the first six months of 2024 alone.
The offending websites allow users to upload images of real, fully clothed people, which are then digitally “undressed” with AI tools that simulate nudity. One of these websites, which wasn’t identified in the complaint, reportedly advertises: “Imagine wasting time taking her out on dates, when you can just use [the redacted website] to get her nudes.”
The website operators are accused of violating state and federal laws banning revenge pornography, deepfake pornography, and child pornography, alongside California’s unfair competition law, because “the harm they cause to consumers greatly outweighs any benefits associated with those practices,” according to the complaint filing. The lawsuit seeks civil penalties, in addition to taking the websites offline and permanently barring their operators from creating future deepfake pornography.
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu said on X. “This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible.”