Google, Facebook, Twitter and other major technology companies met Wednesday with U.S. government officials to discuss their plans to combat misinformation on social media in the run-up to the November election.
In a joint statement, the companies said it was the latest in a series of meetings "about what we see on our respective platforms and what we expect to see in the coming months." They said the day's meeting focused on the upcoming political conventions and "scenario planning related to election results."
Technology companies are under intense pressure from government officials to prevent their platforms from being exploited by foreign actors and others to disrupt the 2020 election, as happened in 2016.
Earlier this month, William Evanina, the director of the National Counterintelligence and Security Center, said Russia, China, Iran and other "foreign actors" are seeking to "sway U.S. voters' preferences and perspectives, shift U.S. policies, increase discord in the United States, and undermine the American people's confidence in our democratic process."
In an interview with NPR's Morning Edition, Facebook's head of security policy, Nathaniel Gleicher, said the company is working harder than ever to fight those efforts, saying the goal is for voters to get accurate information.
"We think about making the act of pretending to be something you're not, or lying to deceive people, more difficult," Gleicher said.
In March, Facebook announced that it had removed 49 Facebook accounts, 69 pages and 85 Instagram accounts involved in "foreign interference." More recently, the social network has taken down about two dozen accounts, some linked to Russia and Iran.
"We share information with independent researchers and then create context so that users can see what's happening on the platform," Gleicher said.
Facebook has a team of fact-checkers for misleading or fake content, according to the company. However, under Facebook's rules, political advertising is exempt from that scrutiny.
Gleicher defended this hands-off approach to political ads, which has allowed President Trump to spread falsehoods about his presumptive Democratic opponent, Joe Biden, to millions of users.
"We know that right now, in this election, there are big debates about exactly the techniques of how people vote and how people engage. We need to make sure people can hear what elected officials are saying and what they think about the vote," Gleicher told NPR.
"But frankly, I think that information is critical to how people are going to vote in the fall. So we need to make sure that information is out there and that people can see it, warts and all," he said.
Facebook has occasionally stepped in, however. In June, for example, it removed Trump campaign ads featuring an inverted red triangle, a symbol that had been used to identify political prisoners in German concentration camps during the Nazi era.
Twitter and Facebook have also removed posts shared by Trump containing false and misleading information related to the coronavirus pandemic.
Gleicher said Facebook consistently applies its rules against QAnon. When asked about The Guardian's findings, he admitted it is hard to keep up with conspiracy campaigns.
"It's an area where I think we have work to do," Gleicher said. "I think the boundaries around what constitutes a QAnon page are a little blurry, and that becomes a challenge for all of us. That's why we're digging into this and exploring additional steps we can take."
Some have raised the possibility that Trump could simply reject the results of the November election if he loses. In July, Trump declined in a Fox News interview to say whether he would accept the outcome.
What would Facebook do if Trump falsely claimed on the platform that he had won the presidential election?
"We're not going to know the results right away. There will be a period of uncertainty while the count is still going on. That's something we've been preparing for," Gleicher said.
"During this period after voting ends and before we know the results, we can expect candidates to make claims about who won," Gleicher said. "We can also expect claims about whether the results were accurate. These are things we've seen around elections before, but I think this time they'll be fundamental."
"How do you accurately report the claims that are being made," he continued, "but also provide context to ensure that people understand and can weigh and make judgments about those things?"
Gleicher also promoted the company's goal of registering 4 million voters by placing "voting information centers" on Facebook and Instagram that provide up-to-date information on how to register, how to request or return mail-in ballots, and where to vote.
"The reason why is that voting is voice, and that's core to our efforts and to who we are as a company," he said. "It's the most powerful way to hold our leaders accountable and to address important issues."
He added: "As a security person, my focus is on making sure voters have accurate information about an election. Disinformation flourishes in uncertainty, and we have seen people take advantage of that uncertainty to run influence operations and other types of deceptive campaigns."
Facebook is an NPR funder.