    Editorials
    Thursday, April 18, 2024

    Facebook's more robust standards

    This appeared in The (New York) Daily News.

    While the world continues to battle a biological pathogen, three dangerous information viruses just suffered a big, albeit belated, setback: Facebook has announced it will no longer tolerate QAnon conspiracy theories, denial of the historical reality of the Holocaust of 6 million Jews, or paid advertisements from anti-vaxxer groups.

    The moves are welcome signs that the online community, with nearly 3 billion members, is finally taking more seriously its responsibility to exercise judgment and deem some ideas so false, hateful and toxic that it has no business helping them spread.

    Over the years, Mark Zuckerberg's company has stumbled repeatedly in trying to determine when it should be a self-regulating free-for-all and what kinds of communications it should seek to root out from the top down. Pornography and extreme gore have long been banned, at least in theory; so too, terrorist incitement.

    The thorniest section of the briar patch regards truth and lies. We wouldn't want Facebook to start deciding, for instance, that astrologers or Scientologists or Santa Claus enthusiasts can't trade content, or to require a five-person team to prescreen your claim that you've just grown the biggest tomato in the county.

    So too, the vast majority of disagreements between politicians should be hashed out by the candidates and campaigns themselves. If Donald Trump says that Dr. Tony Fauci used to staunchly oppose mask-wearing or Joe Biden says the trade deficit with China is higher than it was before — both misleading or false claims — it's not for Facebook to play arbiter. With hundreds of debatable claims in thousands of races, policing them all would untenably hinder speech.

    But lies designed to undermine the election itself should be verboten. And when the Trump campaign circulated an ad with a false image of Biden wearing an earpiece, Facebook should have swiftly struck it. Deep-fake videos of either candidate must also be disallowed, especially when Facebook is profiting off promoting them.

    Not all lies are created equal. Algorithms are horrible at telling the difference. Humans at Facebook must. 
