How Facebook Can Slow QAnon for Real

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

Like other dangerous ideas, the QAnon conspiracy is tricky to root out online. But it’s not impossible.

QAnon is a sprawling, false set of theories claiming that powerful institutions are controlled by pedophile cannibals plotting against President Trump. It's also a chameleon: supporters use legitimate causes like protecting children or promoting wellness to appeal to newcomers, then draw them into the movement's outlandish ideas.

QAnon adherents tailored their ideas for Facebook, which at first moved slowly against the movement. The company announced in August that it was restricting QAnon activity, but so far its actions haven't accomplished much, my colleagues Sheera Frenkel and Tiffany Hsu wrote.

I talked with Sheera about how much blame Facebook deserves for the spread of this dangerous conspiracy, and what we can learn from internet companies’ prior crackdown on terrorist recruitment.

First, QAnon beliefs have been linked to real-world violence. Plus, Facebook says it wants a "healthy community." Does it believe these conspiracies are part of that?

How much blame does Facebook deserve for QAnon’s growth?

When Facebook changed its focus to encourage people to gravitate toward smaller, more intimate groups, it inadvertently created safe havens where people could discuss how to spread QAnon theories.

Facebook needs to ask itself whether it bears responsibility for fueling QAnon, and to think through the consequences of that.

Have any internet companies managed to slow the spread of ideas related to QAnon?

Reddit used to be ground zero for QAnon, until it banned a whole section of the site dedicated to the conspiracy in 2018. There is still QAnon stuff on Reddit, but the content largely moved elsewhere — including to Facebook.

Could things have been different for conspiracies on Facebook, too?

I wonder how different our world would look if Facebook, YouTube and Twitter had joined Reddit in taking coordinated, effective action against QAnon. That's what the companies did in 2015, when the Islamic State was using social media to recruit new followers. You could see, almost in real time, ISIS losing much of its ability to recruit online.

In my mind, that was the clearest example of the internet companies — when they were motivated to do so — taking action to remove a dangerous group that was pervasive on their sites. This action was supported by the White House, and the internet companies felt empowered to make an overwhelming show of force.

Now, though, tech companies are divided over what to do about QAnon, and they don’t have clear direction from the administration. We’ve not seen condemnation of QAnon from the White House, let alone support for social media companies to restrict its spread.

Here’s what Android users need to do to install apps via the Chrome web browser:

First, give Chrome permission to install apps. On recent versions of Android, that means turning on the "Install unknown apps" permission for the browser in your phone's settings; the exact menu varies by device. A warning before you start: apps installed this way skip the Play Store's review process, so only proceed if you trust the source.

From there, do a web search for the application file you're looking for and download it through the website.

Another warning: Make sure you are downloading what you’re looking for. Sometimes bogus and dangerous software is disguised as the official version of an app.
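If the app's developer publishes an official SHA-256 checksum for the file (many don't, so treat this as an optional extra check when it's available), you can compute your download's checksum yourself and compare the two. Here is a minimal Python sketch; the file name and published value below are placeholders, not real data:

import hashlib

def sha256_of(path):
    # Read the file in chunks so large downloads don't exhaust memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder values: substitute your downloaded file and the checksum
# the developer actually publishes.
downloaded_file = "app-release.apk"
published_checksum = "0000000000000000000000000000000000000000000000000000000000000000"

if sha256_of(downloaded_file) == published_checksum.strip().lower():
    print("Checksum matches the published value.")
else:
    print("Checksum does NOT match. Do not install this file.")

If the values differ, the file is not the one the developer published, and you should delete it rather than install it.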

Good luck, and be careful.


I have to restrain myself from putting red pandas in this spot every day. They are the best. Here is Lin from the Cincinnati Zoo eating apples and bananas. Did you know red pandas have semi-opposable thumbs?!


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here.
