Late News

Facebook says it will extend its QAnon ban

The news: Facebook announced on Tuesday that it will remove “any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content.” QAnon, the pro-Trump conspiracy theory centered on the belief that the president of the United States is at war with a secret satanic pedophile ring run by liberals, has grown into an “omniconspiracy” in recent months. Accordingly, it has become a powerful distributor of conspiratorial thinking on a variety of topics, including misinformation about the pandemic and the presidential election.

The context: This goes further than the narrower ban announced in August, when Facebook said it would remove pages, groups, and accounts containing “discussions of potential violence.” By then, QAnon had already inspired a growing list of destructive, sometimes violent, acts, and as early as 2019 the FBI had warned that the theory could inspire violence.

Why now? QAnon flourished on social media for years before this summer, and many critics felt that Facebook’s partial ban was too little, too late. The broader ban was likely prompted by the theory’s staggering growth since March: an internal Facebook study this summer found that QAnon-associated groups had millions of members. Tuesday’s announcement also cited QAnon’s role in spreading dangerous misinformation during the wildfires in the western United States as another reason for the more aggressive ban.

Brian Friedberg, a senior researcher at the Harvard Shorenstein Center’s Technology and Social Change Project who has been tracking QAnon since its early days, said in a text message that while the announcement will likely fuel rumors among QAnon supporters that this ban amounts to “election interference” against Trump, the timing suggests that Facebook is trying to “AVOID further spread of election disinfo” from QAnon’s distribution networks by acting now. 

QAnon believers were expecting this: Although QAnon