Google won’t downgrade top deepfake sites unless victims report them en masse

Today, Google announced new measures to combat the rapid spread of explicit, non-consensual AI-generated deepfakes in its search results.

Citing a “concerning increase in generated images and videos that show people in sexually explicit contexts, distributed across the web without their consent,” Google said it consulted with “experts and victim-survivors” to make “significant updates” to its widely used search engine to “further protect people.”

Specifically, Google has made it easier for targets of explicit fake imagery (who, experts say, are overwhelmingly women) to report and remove deepfakes that surface in search results. Google has also taken steps to downrank explicit deepfakes “to prevent this type of content from appearing in the top search results,” the world’s leading search engine said.

Victims of deepfake pornography have previously criticized Google for not being more proactive in fighting deepfakes in its search results. Hunting down the images and flagging them all is a “time-consuming and energy-draining” process and a “constant battle,” Kaitlyn Siragusa, a Twitch streamer with an explicit OnlyFans account who is frequently targeted by deepfakes, told Bloomberg last year.

In response, Google has worked to “make the process easier,” in part by “helping people address this issue at scale.” Now, when a victim submits a removal request, “Google’s systems will also aim to filter all explicit results on similar searches about them,” Google’s blog says. And once a deepfake is “successfully removed,” Google will “scan for and remove any duplicates of that image” it finds, the blog says.

Google’s efforts to downrank fake and harmful content have also expanded, the tech giant said. For people targeted by deepfakes, Google will now “lower explicit fake content” for searches that include people’s names. According to Google, this step alone has “reduced exposure to explicit image results on these types of queries by more than 70%.”

However, Google remains reluctant to downrank general searches that could lead people to harmful content. A quick Google search confirms that general queries with keywords like “celebrity nude deepfake” direct searchers to popular destinations where they can find non-consensual intimate images of celebrities or request images of lesser-known people.

For victims, the bottom line is that problematic links will continue to appear in Google search results for anyone willing to keep scrolling or to deliberately search for “deepfakes.” The only step Google has taken recently to demote major deepfake sites like Fan-Topia or MrDeepFakes is a promise to demote “sites that have seen a high volume of takedowns of fake explicit images.”

It’s not yet clear what Google considers a “high volume,” and Google declined Ars’ request for comment on whether those sites will eventually be downranked. Instead, a Google spokesperson told Ars that “if we receive a high volume of successful removals from a specific website under this policy, we will use that as a ranking signal and demote the site in queries where the site might appear.”

For now, the Google spokesperson said, Google is focused on demoting “queries that include the names of individuals,” which “have the greatest potential for individual harm.” More queries will be downranked in the coming months, the spokesperson said, as Google continues to grapple with the “technical challenge for search engines” of distinguishing between “real, consensual explicit content (such as an actor’s nude scenes)” and “explicit fake content (such as deepfakes featuring said actor),” Google’s blog said.

“This is an ongoing effort, and we have more improvements planned in the coming months to address a wider variety of queries,” the Google spokesperson told Ars.

In its blog, Google said that “these efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them appearing in the future.”

But many deepfake victims have said that spending hours or even months taking down harmful content gives them little hope that the images won’t resurface. Most recently, deepfake victim Sabrina Javellana told The New York Times that even after her home state of Florida passed an anti-deepfake law, the law did not stop her fake images from spreading online.

She eventually stopped trying to get the images taken down anywhere, telling the Times, “It never ends. I just have to accept it.”

According to US Rep. Joseph Morelle (D-N.Y.), a federal law against deepfakes is needed to deter more bad actors from harassing and terrorizing women with deepfake pornography. He has introduced one such law, the Preventing Deepfakes of Intimate Images Act, which would criminalize creating deepfakes. It currently has 59 co-sponsors in the House and bipartisan support in the Senate, Morelle said at a panel held this week by the National Organization for Women and the Campaign to Ban Deepfakes, which Ars attended.

Morelle said he has spoken with victims of deepfakes, including teenagers, and concluded that “a national ban and a national set of civil and criminal remedies made the most sense” to combat the problem with “urgency.”

“A patchwork of different state and local jurisdictions with different rules” would be “really hard to follow” for victims and for perpetrators trying to understand what is legal, Morelle said, while federal legislation imposing liability and penalties on offenders would likely have “the greatest impact.”

Victims, Morelle said, suffer mental, physical, emotional, and financial harm every day. And, as fellow panelist Andrea Powell pointed out, there is no healing because there is ultimately no justice for survivors of what she warned is a “prolific” and “catastrophic accumulation of those abuses.”
