QAnon still rampant on Twitter despite July takedown

In their broad strokes, these findings do not deviate significantly from Twitter’s public portrayals of the effects of its move against QAnon, which came after more than 2 1/2 years of mounting evidence about the hateful, violent nature of the conspiracy theory and its penchant for sparking real-world crimes. The House of Representatives voted Friday to condemn QAnon.

Twitter has said it sought to eliminate accounts that violated its rules on harassment, hate speech and incitement to violence, but that it wanted to allow other QAnon supporters to continue operating on the platform, albeit with new restrictions, so long as they followed its policies. Overall, the company says its action caused discussion of the conspiracy theory to fall by more than half.

The researchers, however, found troubling evidence that Twitter has not yet done enough and that the conspiracy theory continues to “persist and expand” on the site.

Facebook and Twitter missed QAnon warnings for years

Others just craved speed: “TREASON = FIRING SQAUD [sic] OR HANGING! DO IT NOW PLEASE THAT’S THE LAW! ! ! ! ! ! ! ! ! ! ! ! ! !”

These posts — from January 2018, just months after QAnon flamed to life from the embers of Pizzagate, with its false claims of a child sex ring run by Democrats out of a Washington pizzeria — were among the many early warnings that the new conspiracy theory was fueling hatred and calls for violence on Facebook, Twitter and other social media.

But it would be years before Facebook and Twitter would make major moves to curb QAnon’s presence on their platforms, despite serious cases of online harassment and offline violence that followed, and moves by other social media companies to limit the spread of QAnon’s lurid and false allegations of pedophilia and other crimes.

One social media company, Reddit, acted far earlier, banning its largest QAnon communities in 2018.

Facebook, Twitter flounder in QAnon crackdown

But the social media companies still aren’t enforcing even the limited restrictions they’ve recently put in place to stem the tide of dangerous QAnon material, a review by The Associated Press found. Both platforms have vowed to stop “suggesting” QAnon material to users, a powerful way of introducing QAnon to new people.

But neither has actually succeeded at that.

How Facebook Can Slow QAnon for Real

This article is part of the On Tech newsletter.

Like other dangerous ideas, the QAnon conspiracy is tricky to root out online. But it’s not impossible.

QAnon is a sprawling and false set of theories that powerful institutions are controlled by pedophile cannibals who are plotting against President Trump. It’s also a chameleon. Supporters use legitimate causes like protecting children or promoting wellness to appeal to newcomers and then draw them into their outlandish ideas.

QAnon adherents tailored their ideas for Facebook, which moved slowly to address the movement at first. Facebook announced in August that it was restricting QAnon activity, but so far its actions haven’t accomplished much, my colleagues Sheera Frenkel and Tiffany Hsu wrote.

I talked with Sheera about how much blame Facebook deserves for the spread of this dangerous conspiracy.
