Meta surrenders to the right on speech
Alexios Mantzarlis, the founding director of the International Fact-Checking Network, worked closely with Meta as the company set up its partnerships. He took exception on Tuesday to Zuckerberg's statement that "the fact-checkers have just been too politically biased, and have destroyed more trust than they've created, especially in the US."
What Zuckerberg called bias is a reflection of the fact that the right shares more misinformation than the left, said Mantzarlis, now the director of the Security, Trust, and Safety Initiative at Cornell Tech.
"He chose to ignore research that shows that politically asymmetric interventions against misinformation can result from politically asymmetric sharing of misinformation," Mantzarlis said. "He chose to ignore that a large chunk of the content fact-checkers are flagging is likely not political in nature, but low-quality spammy clickbait that his platforms have commodified. He chose to ignore research that shows Community Notes users are very much motivated by partisan motives and tend to over-target their political opponents."
While Community Notes has shown some promise on X, a former Twitter executive reminded me today that volunteer content moderation has its limits. Community Notes rarely appear on content outside the United States, and often take longer to appear on viral posts than traditional fact checks. There is also little to no empirical evidence that Community Notes are effective at harm reduction.
Another wrinkle: many Community Notes currently cite as evidence fact checks created by the very fact-checking organizations that Meta just stopped funding.
What Zuckerberg is saying is that it will now be up to users to do what automated systems were doing before, a giant step backward for a person who prides himself on having some of the world's most advanced AI systems.
"I can't tell you how much harm comes from non-illegal but harmful content," a longtime former trust and safety employee at the company told me. The classifiers that the company is now switching off meaningfully reduced the spread of hate movements on Meta's platforms, they said. "This is not the climate change debate, or pro-life vs. pro-choice. This is degrading, horrible content that leads to violence and that has the intent to harm other people."