    Nation
    Monday, April 29, 2024

    Trump's coronavirus test sparks QAnon misinformation spree

    Adherents of QAnon, the vast conspiracy theory that baselessly claims that a satanic cabal of high-profile liberals runs a global human trafficking operation, are used to scouring the headlines for items of news they can point to as evidence they're on to something. Social media and communications companies are used to watching those claims spread across their platforms in real time.

    As soon as President Donald Trump announced he had tested positive for the coronavirus, both sprang into action.

    QAnon believers distorted the news, falsely claiming that the president is pretending to go into quarantine as part of a grand plan to take down the alleged human trafficking cabal. Trump has said he does not know much about the QAnon phenomenon but has appeared to condone its supporters, saying they are people who "love America" and "like me very much."

    YouTube and Facebook both said they immediately began monitoring for coronavirus diagnosis-related misinformation after Trump announced his positive test and that of first lady Melania Trump.

    "Within minutes of their diagnosis being made public, our systems began surfacing authoritative news sources on our homepage, as well as in search results and watch next panels regarding the President and COVID-19," YouTube spokesperson Alex Joseph said in a statement.

    A Facebook spokesperson, who declined to be named because the situation is "rapidly evolving," said in an email that the company is tracking the spread of conspiracy theories and will work to fact check and label misleading content.

    The company said it would also remove content that violates its policies, such as calls for or celebrations of the president's death and claims that the election is being canceled or postponed.

    Twitter did not outline new efforts to contain conspiracy theories around Trump's diagnosis.

    "Using a combination of technology and human review, our teams have taken steps to address coordinated attempts to spread harmful misinformation around COVID-19. This applies today too," Twitter spokeswoman Liz Kelley said in a statement.

    Taking action on content that breaks rules, including spam and content that expresses a desire for death, serious bodily harm or fatal disease, is part of ongoing work to protect the public conversation, she said.

    Platforms have long been under fire for allowing false information and discriminatory ideologies to spread on their services. In recent months, they have been under pressure to more comprehensively tackle white supremacist content, as well as COVID-19 and election-related disinformation.

    Facebook has come up short in its attempts to contain QAnon content, which burst into the mainstream earlier this year.

    In August, Facebook said it had removed 790 QAnon groups and restricted an additional 1,950 groups related to the conspiracy theory. Since then, a QAnon Facebook group has added hundreds of new followers, and the company's own algorithm has recommended groups discussing the theory to users, The New York Times found.

    In July, Twitter said it was removing thousands of QAnon accounts, but many returned within weeks, according to a New York Times investigation.

    Reddit did not immediately respond to an inquiry about what types of Trump coronavirus conspiracy theories and other harmful activity have circulated on its platform.