One of five artificial intelligence companies sued for allegedly enabling internet users to “nudify” images of anyone without their consent has agreed to a settlement, but at least one website operator also named in the complaint is digging in for a fight over innovation and creative freedom.
The San Francisco City Attorney’s Office, acting on behalf of the State of California under a state law that deputizes city attorneys in certain types of cases, sued five companies and one website owner in August. In March, it dropped one company from the suit but added another, along with four more website owners.
The lawsuit accuses the companies and their owners of taking advantage of open-source AI software that allows users to modify real pictures of women and girls to show them naked, including their “intimate body parts.”
“Collectively, these websites have been visited hundreds of millions of times in the last year alone,” the lawsuit said.
Such non-consensual “deepfake” images can be used to “bully, threaten, and humiliate women and girls,” the lawsuit said, citing cases in California schools. The lawsuit also refers to an FBI advisory saying the agency “continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content” and then “publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion.”
The lawsuit seeks a court order forcing the companies and owners to take down websites allowing the creation of non-consensual deepfake pornography.
On Friday, a judge approved a settlement with Briver, a New Mexico company that owns and operates the websites “Undresser.ai” and “Porngen.art.” Briver did not admit to any wrongdoing but agreed it will not own, operate or assist in the operation of any website that uses AI “to convert clothed images of identifiable individuals into nude or sexually explicit images.” Briver will also pay a $100,000 fine.
Also named as defendants in the lawsuit are artificial intelligence companies Sol Ecom of Los Angeles; Itai Tech of the United Kingdom; and Defirex and CodeBionic Labs of Estonia; along with website operators Augustin Gribinets of Estonia; Artem and Bakhadyr Ashirbekov of Spain; and Gaofan Xu of China. None has responded to the lawsuit in court.
But website operator Richard Tang of Illinois last week filed a full-throated attack on the lawsuit in San Francisco Superior Court, defending the right to create deepfake pornography.
“The history of the Internet is one where its earliest innovations were driven by the nearly unquenchable thirst for erotica,” the May 27 filing said. “AI image generation is no different.”
The lawsuit alleged Tang’s “Deepnude” website produced AI-generated, non-consensual intimate images of adults and children, and that Tang promoted it as a tool allowing users to “see any girl clothless with the click of a button.”
Tang is accused of failing to verify that people shown in images on the website “consented to the nudification of their respective images,” and of knowing that “the primary purpose” of websites like his is to create non-consensual sexualized images of “identifiable women and girls.”
“Users can upload an image of a clothed woman to Deepnude.cc, and the site will create a fake nude image of the subject,” the lawsuit alleged. “Because Tang has failed to deploy available technology to detect images of minors, users can upload an image of a clothed girl under 18 years old to Deepnude.cc, and the site will create a fake nude image of the subject.”
Tang claimed in his filing that there was no evidence his website was used to upload images depicting children.
“The claim that the technology can be used to create such materials is no more alarming than to claim that a camera can be used to take nude photos of children, or a pen can be used to write a bomb threat,” the filing argued.
“This technology can be used for entertainment, for humor, for commentary, or for any other legitimate reason within the universe of artistic expression,” the filing said. “Further, as time ravages all of our bodies, we sometimes might wish to remember ourselves younger, but few of us have nude glamour shots of ourselves from back when doing so might have given us a more pleasing result.”
Tang cited the website’s purported terms of service, attached to the lawsuit as an exhibit in the form of an archived web page, which threatened to ban users who uploaded images copyrighted by someone else, threatening or harassing material, or illegal content such as child pornography and revenge porn.
“Any user who may have uploaded an image without permission of the subject or of a minor to create a ‘nudified’ image did so in contravention of the website’s terms and without authorization from Tang,” the filing said.
Although Tang said in the filing that he shut down the website about a month after the lawsuit was filed, he argued that he bore no legal responsibility for its content because Section 230 of the U.S. Communications Decency Act shields website owners from liability for user-generated content. Tang also argued that his website used a third-party algorithm popular among creators of “nudifying” websites.
“The website served as nothing more than a middleman facilitating use of the … image generation algorithm,” the filing argued.
In September, a bill by California state Sen. Aisha Wahab (D-San Jose) was signed into law, making it a crime to create and distribute non-consensual deepfake pornography. Last month, President Donald Trump signed the bipartisan “Take It Down Act,” which increases penalties for knowingly publishing or threatening to publish such material.
The lawsuit, however, alleged violations of pre-existing laws against disclosing and distributing non-consensual sexual images and intimate images of children.
Ash Johnson, senior policy manager at the Washington, D.C., non-profit Information Technology and Innovation Foundation, said the Take It Down Act means Section 230 no longer applies to non-consensual deepfake pornography.
“Enforcement is definitely more complicated when overseas actors are involved,” Johnson said. “Unfortunately, this seems to be an inevitable obstacle when it comes to virtually every illegal online activity.”