More than a month after the presidential election was held in the U.S., incumbent President Donald Trump has refused to concede his loss to Joe Biden, even though the latter has clearly secured 306 Electoral College votes (to Mr. Trump’s 232) and 51.3% of the popular vote (against Mr. Trump’s 46.9%).
Mr. Trump continues to push conspiracy theories of a rigged election and has sought to overturn results in several battleground States by trying to nullify postal ballots. Even though courts across these States have dismissed more than three dozen lawsuits filed by the Trump campaign, and the facts fly in the face of the obvious disinformation and lies fed by it, opinion polls estimate that three-fourths of Republican voters now believe that Mr. Biden won because of voter fraud.
COVID-19 is not the only viral threat that has engulfed the U.S.; the misinformation “infodemic” — the deliberate feeding of fact-free rhetoric without any basis in reality, spread through social media and messaging platforms and amplified by demagogic politicians and public officials for political gain — has become a grave threat to its democracy. No phenomenon exemplifies the scourge of misinformation in the U.S. more than ‘QAnon’, the far-right fringe conspiracy theory that emerged as anonymous postings on the frivolity-laden message board “4chan” in 2017 and has since spread like wildfire across social media platforms.
The outlandish theory behind QAnon is that President Trump’s administration was leading a crusade against a ring of cannibalistic paedophiles that included Democratic Party leaders and celebrities, and that a Trump administration insider, ‘Q’, was feeding information about this crusade, assuring believers that everything under the presidency was “under control”, in contrast to the chaos being depicted by mainstream media.
Since its origins, this group has grown into an extremist network on the Internet, spreading rampant misinformation about politics and the COVID-19 response, in particular about mask wearing and lockdowns. Two Republican candidates for the House of Representatives, Marjorie Taylor Greene and Lauren Boebert, who have endorsed QAnon theories, have won their elections — in other words, the outlandish fringe is now mainstream in some parts of the U.S.
Meanwhile, some QAnon believers have moved from hateful rhetoric on the Internet to hate crimes, such as plotting to plant a bomb in the Illinois State Capitol in 2018, using an armoured car to block traffic on a bridge in 2018 and vandalising a Catholic church in Arizona in 2019, among others. This led to a bipartisan resolution, passed in the House of Representatives in October 2020, condemning QAnon and urging Americans to seek information from authoritative sources. Mr. Trump, on the other hand, acknowledged support from this “movement” and did nothing to disavow its baseless claims.
The viral spread of misinformation has led social media platforms to regulate and ban QAnon groups and posts — an analysis by Facebook found that the major QAnon groups and pages had more than 3 million members in August 2020. The social network then sought to aggressively take down content promoted by QAnon groups, after initial attempts to limit them, which relied on users reporting such content, failed as the conspiracy-mongering spread out of control. QAnon group administrators have since sought to regroup on other message boards, but there has been a lull in activity since Mr. Trump’s loss in the election, something that was unimaginable for QAnon adherents, for whom he was always “in control”.
Companies like Facebook, which has more than 2.7 billion users worldwide, have long had an “organic content” problem, one shared by other ‘BigTech’ websites such as Twitter and Google-owned YouTube. BigTech algorithms on social media are designed to connect people by grouping them on the basis of similar likes and interests and to amplify “viral content”. Keeping people online is key to revenue for social media companies, and an algorithmic approach that feeds cognitive biases (likes and interests) is a clear outcome of this strategy. Algorithms promote, on user feeds, content that “engages” attention, producing a cascading effect on information sharing that has little to do with credibility or accuracy and much to do with the reinforcement of beliefs within echo chambers.
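The feedback loop described above can be illustrated with a toy sketch (the scoring weights here are hypothetical, invented purely for illustration, and do not represent any platform’s actual algorithm): when a feed is ordered purely by engagement, a post that provokes comments and shares outranks a more accurate but less provocative one.

```python
# Toy illustration of engagement-driven feed ranking (hypothetical
# weights, not any real platform's algorithm).

def rank_feed(posts):
    """Order posts by a simple engagement score; accuracy plays no role."""
    def engagement(post):
        # Made-up weights: shares spread content furthest, so they
        # count most; comments next; likes least.
        return post["likes"] + 2 * post["comments"] + 3 * post["shares"]
    return sorted(posts, key=engagement, reverse=True)

posts = [
    {"id": "fact-check", "likes": 120, "comments": 10, "shares": 5},
    {"id": "conspiracy", "likes": 90, "comments": 80, "shares": 60},
]

for post in rank_feed(posts):
    print(post["id"])
# The provocative "conspiracy" post scores 430 and ranks above the
# "fact-check" post's 155, despite fewer likes.
```

The sketch makes the structural point: a scoring function with no term for credibility will systematically surface whatever generates the most interaction.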
This is how groups based on conspiracy theories such as QAnon acquired their potency for misinformation, leaving platforms like Facebook and Twitter scrambling for ways to regulate them well after they had become popular. As the U.S. continues to record a high daily COVID-19 death toll of more than 2,000, it is now clear that the pandemic and the “infodemic” are interlinked, and a coherent response to the former will have to include one for the latter.