Facebook institutes sweeping ban on QAnon. Will it work?

By Suhauna Hussain

Los Angeles Times

In its most sweeping content policy decision to date, Facebook implemented a comprehensive ban on QAnon-related pages, groups and Instagram accounts Tuesday. The action represents a sharp escalation against purveyors of the vast conspiracy theory that baselessly claims that a shadowy, Satan-worshipping cabal of Democrats and other elites operates a child sex trafficking ring.

In August the company said it would remove content, accounts and groups associated with QAnon only when they discussed potential violence. That initial policy led the company to remove 790 QAnon groups and restrict an additional 1,950 related to the conspiracy theory.

Three QAnon-focused groups identified by the Los Angeles Times that were online last week were unavailable Tuesday after the announcement.

In a Tuesday blog post, Facebook said that although QAnon posts may not directly promote violence, they are often nonetheless linked to “different forms of real world harm.” The company cited a barrage of false claims by QAnon followers in recent weeks that wildfires ravaging the West Coast were started by members of leftist anarchist groups such as antifa, diverting local officials’ attention from managing the fires and protecting residents.

The company also noted that it has begun directing users to credible child safety resources when they search for hashtags co-opted by QAnon supporters, such as #SaveTheChildren, which conspiracy theorists have used to spread the false claim that children are being kidnapped as part of the alleged human trafficking ring.

But the effort may be too little, too late.

QAnon was a fringe movement when it sprouted in convoluted 4chan posts in 2017. But it has since grown enormously, bubbling into the mainstream this year and animating right-wing politics.

While Reddit and YouTube took earlier action against QAnon, Twitter and Facebook did not make moves to shut down or place limits on QAnon-linked accounts until this summer.

Reddit banned the main forum for QAnon conspiracy theories in fall 2018.

YouTube spokesman Alex Joseph said in an email the company has removed tens of thousands of QAnon-related videos and channels for violating the site’s hate and harassment policies since 2018. In early 2019, the company also aimed to reduce recommendations of videos referencing QAnon-related conspiracy theories.

It’s unclear how such a sweeping ban will play out in practice. Facebook has struggled to enforce previous policies comprehensively. In its August move, the company aimed to limit the reach of QAnon pages and accounts even when it did not ban them outright. Still, Facebook’s own recommendation algorithm steered users to groups discussing the theory, and those groups continued to grow, adding hundreds of new members, a New York Times investigation found in September.

Since the August crackdown, QAnon groups have made their references to the conspiracy theory less explicit in an effort to evade the company’s enforcement.

“I’m pleased to see Facebook take action against this harmful and increasingly dangerous conspiracy theory and movement,” U.S. Sen. Mark R. Warner, D-Va., said in a statement. “Ultimately the real test will be whether Facebook actually takes measures to enforce these new policies — we’ve seen in a myriad of other contexts, including with respect to right-wing militias like the Boogaloos, that Facebook has repeatedly failed to consistently enforce its existing policies.”