Facebook vows to restrict users if US election descends into chaos


Matt Anderson Photography | Getty Images

Facebook has said it will take aggressive and exceptional measures to “restrict the circulation of content” on its platform if November’s presidential election descends into chaos or violent civic unrest.

In an interview with the Financial Times, Nick Clegg, the company’s head of global affairs, said it had drawn up plans for how to handle a range of outcomes, including widespread civic unrest or “the political dilemmas” of having in-person votes counted more rapidly than mail-in ballots, which will play a larger role in this election due to the coronavirus pandemic.

“There are some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances,” Mr Clegg said, though he stopped short of elaborating further on what measures were on the table.

The proposed actions, which would probably go further than any previously taken by a US platform, come as the social media group is under increasing pressure to lay out how it plans to combat election-related misinformation, voter suppression and the incitement of violence on the November 3 election day and during the post-election period.

It also comes as concerns mount that US president Donald Trump himself could take to social media to contest the result or call for violent protest, potentially triggering a constitutional crisis.

“We have acted aggressively in other parts of the world where we think that there is real civic instability and we obviously have the tools to do that [again],” Mr Clegg added, citing the previous use of “pretty exceptional measures to significantly restrict the circulation of content on our platform.”

Facebook refused to go into detail over its plans for election-related content control, as malicious actors might use that information to proactively work out how to game the system. However, during previous periods of unrest in Sri Lanka and Myanmar, the company took action including reducing the reach of content shared by repeated rule-breakers, and limiting the distribution of “borderline content” that was sensationalist but did not quite breach its hate speech rules.

Facebook is bracing for what is likely to be a highly polarizing election, and remains in the spotlight after it failed to catch attempts by Russia to manipulate the 2016 US vote. There are fears that Mr Trump could try to interfere in the process this time, since he has already refused to commit to accepting the outcome, argued it could be rigged and sought to delegitimize postal voting.

Against this backdrop, Facebook has been exploring how to handle about 70 different potential scenarios, according to a person familiar with the situation, drawing on staff who include world-class military scenario planners.

In recent weeks it has announced several new policies on election misinformation and voter suppression, including a commitment to add cautionary labels to posts in which campaigns or candidates prematurely claim victory.

According to Mr Clegg, any high-stakes decisions will fall to a team of top executives including himself and chief operating officer Sheryl Sandberg—with chief executive Mark Zuckerberg holding the right to overrule their decisions.

“We’ve slightly reorganized things such that we have a fairly tight arrangement by which decisions are taken at different levels [depending on] the gravity of the controversy attached,” Mr Clegg said.

The executive also said that “the amount of resources we are throwing at this is very considerable.” Facebook will have a virtual war room—dubbed its “Election Operations Centre”—for monitoring for suspicious activity and updating its “voter information hub,” which will showcase verified results to users, he said.

As well as fighting a rising tide of misinformation from both foreign and domestic operatives, experts warn that Facebook must prevent the platform from being used to foment violence.

Earlier this month, Facebook was criticized for failing to shut down a militia group that was encouraging armed citizens to gather in Kenosha, Wisconsin, shortly before a 17-year-old who had heeded such calls fatally shot two people.

Mr Clegg said Facebook was carrying out “proactive sweeps” for dangerous groups and incitement, including “in areas where we know that their activity is likely to be more pronounced in other parts of the country.”

© 2020 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
