San Francisco City Attorney David Chiu is suing to shut down 16 of the most popular websites and apps that allow users to "nudify" or "undress" images of mostly women and girls who are increasingly harassed and exploited by bad actors online.
These sites, according to Chiu's lawsuit, are "intentionally" designed to "create fake nude photographs of women and girls without their consent," and boast that any user can upload any photo to "see anyone naked," using technology that realistically swaps the faces of real victims onto AI-generated explicit images.
"In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated" non-consensual intimate imagery (NCII), and "this distressing trend shows no sign of abating," Chiu's lawsuit says.
“Given the wide availability and popularity” of nudification websites, “San Franciscans and Californians are at risk of themselves or their loved ones being victimized in this manner,” Chiu’s lawsuit warns.
At a news conference, Chiu said this "first-of-its-kind lawsuit" was filed to protect not only Californians but also "a shocking number of women and girls across the globe," from celebrities like Taylor Swift to middle and high school girls. If the city official wins, each nudify site faces fines of $2,500 for each violation of California's consumer protection law.
In addition to media reports sounding the alarm about AI-generated harms, law enforcement has joined the call to ban so-called deepfakes.
Chiu said harmful deepfakes are often created "by exploiting open-source AI image generation models," such as earlier versions of Stable Diffusion, which can be refined or "fine-tuned" to easily "undress" photos of women and girls that are frequently pulled from social media. While later versions of Stable Diffusion make such "disturbing" forms of misuse much harder, San Francisco city officials noted at the press conference that fine-tuned earlier versions of Stable Diffusion remain widely available for abuse by bad actors.
In the United States alone, police have lately been so bogged down with reports of fake AI child sex images that it is difficult to investigate child abuse cases offline, and those AI cases are expected to continue piling up "exponentially." The AI-fueled abuse has spread so widely that "the FBI has warned of an uptick in extortion schemes using AI-generated non-consensual pornography," Chiu said at the press conference. "And the impact on victims has been devastating," harming "their reputations and their mental health," causing "loss of autonomy," and "in some instances, causing individuals to become suicidal."
Suing on behalf of the people of the state of California, Chiu is seeking a court order requiring nudify site owners to cease operating "all websites they own or operate that are capable of creating AI-generated" non-consensual intimate images of identifiable individuals. It is the only way, Chiu said, to hold those sites accountable "for creating and distributing AI-generated NCII of women and girls and for aiding and abetting others in perpetrating this conduct."
He also seeks an order requiring "any domain-name registrars, domain-name registries, webhosts, payment processors, or companies providing user authentication and authorization services or interfaces" to "restrain" nudify site operators from launching new sites, to prevent further misconduct.
Chiu's complaint redacts the names of the most harmful sites his investigation uncovered, but states that in the first six months of 2024, those sites "were visited more than 200 million times."
Although victims generally have little legal recourse, Chiu believes that federal and state laws banning deepfake pornography, revenge pornography, and child pornography, as well as California's unfair competition law, can be used to take down the 16 sites. Chiu hopes a win will warn other nudify site operators that more takedowns are likely.
"We are filing this lawsuit to shut down these websites, but we also need to sound the alarm," Chiu said at the press conference. "Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit them. We have to be clear that this is not innovation. This is sexual abuse."