Stop Saying Facebook Is 'Too Big to Moderate'


Gilad Edelman


On Monday, a new disinformation video about the coronavirus exploded across the internet. Produced by the right-wing website Breitbart, it was a clip of a press conference held by a group calling itself America's Frontline Doctors. The video contained dangerously false claims about the coronavirus, including that masks are unnecessary and that hydroxychloroquine cures the disease. (There is no known cure.) It was a test of social media platforms' official policies against pandemic misinformation, and by some measures, they passed. By Tuesday morning, Facebook, Twitter, and YouTube had all taken the post down for violating their rules on false information about Covid treatments and cures.

For Facebook in particular, the episode might be counted as a qualified success. Many people, including the company's own employees, have argued that it moves too slowly against false and harmful posts on the platform. Here, Facebook was the first major platform to act. There's just one problem: by the time Facebook took the video down on Monday night, it had already been viewed more than 20 million times, according to NBC News. The horse was miles away before the barn doors closed.

Coming on the eve of a high-profile congressional hearing on antitrust and competition in the tech industry, the episode has revived a familiar critique of Facebook: that the platform is too big to moderate well, even when it has the right policies. As Charlie Warzel of the New York Times put it on Twitter, "Facebook cannot moderate misinformation at its scale. If videos can spread this widely before the company takes action (as is frequently the case), then there's no real hope. It's not about finding a fix; the platform is the problem."

It's a very popular view, but it doesn't make much sense. It's true that no platform that relies on user-generated content and has millions or billions of users will ever be able to enforce its content rules perfectly at that scale. But in no other industry, with the possible exceptions of airlines and nuclear power plants, do we suggest that anything short of perfection is failure. No one says there are too many people in the world to enforce laws at scale; we just employ a ton of police officers. (Yes, the protest movement against police violence argues forcefully that those budgets would be better spent elsewhere, a question for another article.) The real question is whether Facebook can do better than it does now: a situation in which it takes so long to act on a blatantly misleading video, created by one of its own official news partners, that tens of millions of users have already seen it is not one the platform can afford to repeat from one misinformation crisis to the next. And there is no reason to think Facebook couldn't move toward that goal if it simply invested more resources in the task.

"They need to hire more content moderators, many more," said Jennifer Grygiel, a communications professor at Syracuse University. "It's a myth that has been created, this idea that it's too big to moderate, that there's too much content."

In 2019, CEO Mark Zuckerberg said Facebook would spend more than $3.7 billion on platform safety, more, he noted, than Twitter's entire annual revenue. The more relevant figure, however, is Facebook's own revenue, which last year was about $70 billion. In other words, Zuckerberg was taking credit for spending just over 5 percent of the company's revenue on making sure its products are safe.

While Facebook barely cracked the Forbes ranking of the 100 biggest corporations by revenue last year, its $24 billion in pre-tax earnings made it one of the most profitable companies in the world. Why? Because its costs are so much lower than those of most other big corporations. Ford Motor Company pulled in $160 billion in revenue in 2018 but generated just $4.3 billion in pre-tax profit. Building cars costs money. Ford also faces stiff competition from many other automakers, which means it is forced to invest in making cars people want to drive, at prices they are willing to pay, all while meeting the extensive safety and emissions requirements imposed by the government.
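To see how wide that gap is, the margins implied by the numbers above can be checked with a few lines of arithmetic. (A quick sketch in Python using only the revenue and profit figures quoted in this article; the rounding is mine.)

    # Pre-tax profit margins implied by the figures cited above
    facebook_revenue = 70e9   # ~$70 billion in 2019 revenue
    facebook_pretax = 24e9    # ~$24 billion in pre-tax earnings
    ford_revenue = 160e9      # ~$160 billion in 2018 revenue
    ford_pretax = 4.3e9       # ~$4.3 billion in pre-tax profit

    print(f"Facebook: {facebook_pretax / facebook_revenue:.0%}")  # ~34%
    print(f"Ford: {ford_pretax / ford_revenue:.1%}")              # ~2.7%

A roughly 34 percent margin against Ford's roughly 3 percent is the gap the comparison is driving at.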

Facebook faces no such pressures. It has little real competition, and its business is almost entirely unregulated by the government. As a result, the company remains free to spend as little as it wants on content moderation and fact-checking. But there's no reason that must be written in stone. Yes, Facebook receives millions of new posts a day, making moderation a daunting task. But Facebook also has billions of dollars in cash. It could triple that $3.7 billion safety investment and still maintain an enviable profit margin.

I don't pretend to know exactly how much Facebook should be spending. Facebook says it currently employs 15,000 content moderators, most of them contractors. According to The Verge, those moderators earn as little as $15 per hour while working in grueling conditions. A recent in-depth report from New York University's Stern Center for Business and Human Rights recommends that the company stop outsourcing content moderation and double the number of moderators to 30,000. But we can dream even bigger. Facebook could triple its moderator workforce for less than $1 billion. It could triple that workforce and double moderators' pay, so that content moderators earn about $60,000 a year, for just over $2 billion. It's hard to say exactly what effect that would have, but it's also hard to believe that tripling Facebook's investment wouldn't make a big difference. A world in which only 1 million more people were exposed to the Breitbart video, because a better-staffed and better-supported corps of moderators was able to flag it and deal with it faster, is better than a world in which 20 million more were. So is a world in which videos like that achieve such wide distribution far less often, even if they are never eliminated entirely.
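For readers who want to check the math, here is a back-of-envelope version of those staffing estimates. (A sketch assuming full-time hours at the quoted $15 wage; the 2,080-hour work year is my assumption, not a figure from the article.)

    # Back-of-envelope staffing costs for the scenarios sketched above
    moderators = 15_000                     # Facebook's reported moderator count
    annual_wage = 15 * 2_080                # $15/hour, full time: ~$31,200/year

    current_cost = moderators * annual_wage             # ~$0.47 billion
    tripled = 3 * moderators * annual_wage              # 45,000 moderators
    print(f"Extra cost to triple headcount: ${(tripled - current_cost) / 1e9:.2f}B")

    tripled_doubled = 3 * moderators * 2 * annual_wage  # 45,000 moderators at ~$62,400
    print(f"Triple headcount, double pay: ${(tripled_doubled - current_cost) / 1e9:.2f}B")

The first scenario adds about $0.94 billion and the second about $2.34 billion, consistent with the "less than $1 billion" and "just over $2 billion" estimates above.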

It is important to be clear-eyed about the nature of the problems posed by Facebook's unprecedented scale and dominance of social media. (The same goes for the other tech giants, including Google, whose executives will be grilled at Wednesday's hearing.) The issue really isn't the number of users. Hardly anyone calls for breaking up Facebook the platform, as opposed to the company. Users want to be on the social network where everyone else is; it would be strange to try to split the existing Facebook user base into separate fiefs. Meanwhile, even if the government forced the company to spin off its acquisitions of Instagram and WhatsApp, which would be a major step forward for antitrust enforcement, Facebook would still have its more than 2 billion users and the same challenge of moderating at scale.

If Facebook's size is a problem, in other words, it's not because it makes moderation impossible. Rather, it's because the company's dominance of the social media market, combined with a regulatory vacuum, allows it to rake in enormous profits no matter how well or poorly it moderates. Rules can be enforced; it just costs money. Failing to enforce the rules has costs, too. They simply land on society's balance sheet, not Facebook's.
