Facebook’s recommendations systems, designed to prioritize the growth of groups, most likely supercharged the QAnon community — exposing scores of people to the conspiracy theory and then forging bonds among like-minded believers who could communicate, organize and spread their message further. As NBC News’s Ben Collins notes, this spread has intensified during the coronavirus pandemic as QAnon has become a hub for public health misinformation on Facebook. According to The Wall Street Journal, “the average membership in 10 large public QAnon Facebook groups swelled by nearly 600 percent from March through July, to about 40,000 from about 6,000.”
QAnon followed a similar growth strategy on platforms like YouTube, building channels around influencers savvy enough to game the platform’s recommendation algorithms. On Twitter, the communities formed around the successful manipulation of hashtags, efforts amplified by the Trump campaign and the president’s Twitter feed. (On Friday, Mr. Trump declined to say whether he supported QAnon.)
This online ecosystem has been attractive to some political candidates. “Politicians see the infrastructure QAnon has built on these platforms. They recognize it as increasing in power and see it as having a political benefit,” said Alex Kaplan, a researcher for the media watchdog group Media Matters for America who has been tracking the increase in QAnon supporters running for Congress. “There are true believers, yes, but many also see pandering to QAnon as a way to cultivate political support. They say, ‘Why not use this infrastructure to get some benefit?’ — be it followers or money or votes.” Mr. Kaplan has reported that there are at least 20 candidates on the ballot in November who support or have spoken favorably of QAnon.
The overtures of campaigns like Ms. Greene’s — and President Trump’s — are only likely to become more overt as QAnon moves further into the mainstream. Journalists like Mr. Kaplan are concerned that more media coverage will lead to the conspiracy theory being normalized. “People should be worried. They should not get used to this,” he told me. “It’s crucial to remember this all started as a theory on a message board linked to white nationalists and trolls that President Trump was involved in a secret plot to take down the deep state and pedophiles. That’s what all of this is.”
For those who’ve been following and reporting on QAnon since its earliest days, this week has been disorienting and disheartening. “It’s a horrifying, humbling and depressing feeling to have seen something like this back when it was just a few forum posts, warn of its potential to infect the nation and end up right,” Paris Martineau, a technology reporter who wrote the first explainer on QAnon for a national news outlet in 2017, told me. “I feel like, over the past three years, there have been so many moments where I thought it had reached its zenith, but it was really only just getting started.”
It’s no coincidence that a technology reporter was one of the first to identify this phenomenon — indeed, much of the best coverage of the movement has come from those steeped in an understanding of social networks. QAnon is a product of the modern, algorithmically powered internet (a fact that reporters flocking to cover the movement need to be mindful of).