Meta surrenders to the right on speech
Alexios Mantzarlis, the founding director of the International Fact-Checking Network, worked closely with Meta as the company set up its partnerships. He took exception on Tuesday to Zuckerberg's statement that "the fact-checkers have just been too politically biased, and have destroyed more trust than they've created, especially in the US." What Zuckerberg called bias is a reflection of the fact that the right shares more misinformation than the left, said Mantzarlis, now the director of the Security, Trust, and Safety Initiative at Cornell Tech. "He chose to ignore research that shows that politically asymmetric interventions against misinformation can result from politically asymmetric sharing of misinformation," Mantzarlis said. "He chose to ignore that a large chunk of the content fact-checkers are flagging is likely not political in nature, but low-quality spammy clickbait that his platforms have commodified. He chose to ignore research that shows Community Notes users are very much motivated by partisan motives and tend to over-target their political opponents."
while Community Notes has shown some promise on X, a former Twitter executive reminded me today that volunteer content moderation has its limits. Community Notes rarely appear on content outside the United States, and often take longer to appear on viral posts than traditional fact checks. There is also little to no empirical evidence that Community Notes are effective at harm reduction. Another wrinkle: many Community Notes currently cite as evidence fact-checks created by the fact-checking organizations that Meta just canceled all funding for.
What Zuckerberg is saying is that it will now be up to users to do what automated systems were doing before — a giant step backward for a person who prides himself on having some of the world's most advanced AI systems.
"I can't tell you how much harm comes from non-illegal but harmful content," a longtime former trust and safety employee at the company told me. The classifiers that the company is now switching off meaningfully reduced the spread of hate movements on Meta's platforms, they said. "This is not the climate change debate, or pro-life vs. pro-choice. This is degrading, horrible content that leads to violence and that has the intent to harm other people."
·platformer.news·
Zuckerberg officially gives up
I floated a theory of mine to Atlantic writer Charlie Warzel on this week’s episode of Panic World that content moderation, as we’ve understood it, effectively ended on January 6th, 2021. You can listen to the whole episode here, but the way I look at it is that the Insurrection was the first time Americans could truly see the radicalizing effects of algorithmic platforms like Facebook and YouTube that other parts of the world, particularly the Global South, had dealt with for years. A moment of political violence Silicon Valley could no longer ignore or obfuscate the way it had with similar incidents in countries like Myanmar, India, Ethiopia, or Brazil. And once faced with the cold, hard truth of what their platforms had been facilitating, companies like Google and Meta, at least internally, accepted that they would never be able to moderate them at scale. And so they just stopped.
After 2021, the major tech platforms we’ve relied on since the 2010s could no longer pretend that they would ever be able to properly manage the amount of users, the amount of content, the amount of influence they “need” to exist at the size they “need” to exist at to make the amount of money they “need” to exist.
Under Zuckerberg’s new “censorship”-free plan, Meta’s social networks will immediately fill up with hatred and harassment. Which will make a fertile ground for terrorism and extremism. Scams and spam will clog comments and direct messages. And illicit content, like non-consensual sexual material, will proliferate in private corners of networks like group messages and private Groups. Algorithms will mindlessly spread this slop, boosted by the loudest, dumbest, most reactionary users on the platform, helping it evolve and metastasize into darker, stickier social movements. And the network will effectively break down. But Meta is betting that the average user won’t care or notice. AI profiles will like their posts, comment on them, and even make content for them. A feedback loop of nonsense and violence. Our worst, unmoderated impulses, shared by algorithm and reaffirmed by AI. Where nothing has to be true and everything is popular.
·garbageday.email·
How Elon Musk Got Tangled Up in Blue
Mr. Musk had largely come to peace with a price of $100 a year for Blue. But during one meeting to discuss pricing, his top assistant, Jehn Balajadia, felt compelled to speak up. “There’s a lot of people who can’t even buy gas right now,” she said, according to two people in attendance. It was hard to see how any of those people would pony up $100 on the spot for a social media status symbol. Mr. Musk paused to think. “You know, like, what do people pay for Starbucks?” he asked. “Like $8?” Before anyone could raise objections, he whipped out his phone to set his word in stone. “Twitter’s current lords & peasants system for who has or doesn’t have a blue checkmark is bullshit,” he tweeted on Nov. 1. “Power to the people! Blue for $8/month.”
·nytimes.com·
On Free Speech and Cancel Culture, Letter Four
“No Campus Free Speech Controversies at the Vast Majority of Colleges This Week” isn’t a headline that can exist.
I think that the difficulty in top-down moderation means that platforms have a great responsibility to provide users with tools to block, mute, go private, and avoid certain terms and topics.
Most of my own readers are disillusioned leftists and liberals, but certainly I host many conservatives here, and I’m fine with that. And, yes, it’s entirely possible for the anti-woke beat to become a shtick, and because there are financial incentives involved, for writers to dedicate more and more time to it. That in turn can provoke people to fixate on problems with language norms or minor culture war kerfuffles, to the detriment of bigger issues of greater intrinsic concern to the country.
Woke and anti-woke are not the same in a simplistic way, but it’s true that they’re caught in a mutually-reinforcing cycle.
Honestly, I’ve never thought of myself as a contrarian leftist at all; I just think of myself as an old-school materialist and civil libertarian leftist who’s unhappy with the evolution of contemporary liberalism. It’s perfectly fair, though, to argue that my priorities are off and that I spend too much time worrying over liberal culture rather than structural injustice.
When I complain that there’s a strain of liberal historiography that seems to deny that people of color have ever had agency, and in doing so makes white people the protagonists of history, that doesn’t seem anti-woke to me; it seems to be an argument for a more expansive vision of what respect for people of color entails.
I think the woke/anti-woke binary is a dead end. Everyone has already taken their places on the stage, and the back-and-forth that exists feels tired and rehearsed. I am 100% open to the idea that the discursive and language controversies I talk about so often are of less importance than deeper issues of structural politics. I might have lost the plot. But as social justice politics have become the language of institutions, albeit opportunistically on the part of those institutions, the need for a vibrant counternarrative has only grown. I think for all of its pitfalls and susceptibility to corruption, “anti-woke” discourse is profoundly necessary. Critical thinking about cancel culture is necessary. A world where Goldman Sachs flies Pride flags outside its offices is a world where left-wing skepticism of woke morals is needed.
·freddiedeboer.substack.com·
‘Silicon Values’
York points to a 1946 U.S. Supreme Court decision, Marsh v. Alabama, which held that private entities can become sufficiently large and public to require them to be subject to the same Constitutional constraints as government entities. Though York says this ruling has “not as of this writing been applied to the quasi-public spaces of the internet”
even if YouTube were treated as an extension of government due to its size and required to retain every non-criminal video uploaded to its service, it would make as much of a political statement elsewhere, if not more. In France and Germany, it — like any other company — must comply with laws that require the removal of hate speech, laws which in the U.S. would be unconstitutional
Several European countries have banned Google Analytics because it is impossible for their citizens to be protected against surveillance by American intelligence agencies.
TikTok has downplayed the seriousness of its platform by framing it as an entertainment venue. As with other platforms, disinformation on TikTok spreads and multiplies. These factors may have an effect on how people vote. But the sudden alarm over yet-unproved allegations of algorithmic meddling in TikTok to boost Chinese interests is laughable to those of us who have been at the mercy of American-created algorithms despite living elsewhere. American state actors have also taken advantage of the popularity of social networks in ways not dissimilar from political adversaries.
what York notes is how aligned platforms are with the biases of upper-class white Americans; not coincidentally, the boards and executive teams of these companies are dominated by people matching that description.
It should not be so easy to point to similarities in egregious behaviour; corruption of legal processes should not be so common. I worry that regulators in China and the U.S. will spend so much time negotiating which of them gets to treat the internet as their domain while the rest of us get steamrolled by policies that maximize their self-preferencing.
to ensure a clear set of values projected into the world. One way to achieve that is to prefer protocols over platforms.
This links up with Ben Thompson’s idea about splitting Twitter into a protocol company and a social media company.
Yes, the country’s light touch approach to regulation and generous support of its tech industry has brought the world many of its most popular products and services. But it should not be assumed that we must rely on these companies built in the context of middle- and upper-class America.
·pxlnv.com·
To Thrive, Our Democracy Needs Digital Public Infrastructure
Facebook, Twitter and YouTube each took first steps to rein in the worst behavior on their platforms in the heat of the election, but none have confronted how their spaces were structured to become ideal venues for outrage and incitement.
The first step in the process is realizing that the problems we’re experiencing in digital life — how to gather strangers together in public in ways that make it so people generally behave themselves — aren’t new. They’re problems that physical communities have wrestled with for centuries. In physical communities, businesses play a critical role — but so do public libraries, schools, parks and roads. These spaces are often the groundwork that private industry builds itself around: Schools teach and train the next generation of workers; new public parks and plazas often spur private real estate development; businesses transport goods on publicly funded roads; and so on. Public spaces and private industry work symbiotically, if sometimes imperfectly.
These kinds of public spaces mostly don’t exist online. Twitter, Facebook, YouTube and Twitch each offer some aspects of these experiences. But ultimately, they’re all organized around the need for growth and revenue — incentives which are in tension with the critical community functions these institutions also serve, and with the heavy staffing models they require.
Recent peer-reviewed research from three professors at the University of Virginia demonstrates how dramatically the design of platforms can affect how people behave on them. In their study, in months where conservative-leaning users visited Facebook more, they saw much more ideological content than normal, whereas in months where they visited Reddit more they “read news that was 50 percent more moderate than what they typically read.” (This effect was smaller but similar for political liberals). Same people, different platforms, and dramatically different news diets as a result.
Wikipedia is probably the best-known example of this kind of institution — a nonprofit, mission-driven piece of digital infrastructure. The nonprofit Internet Archive, which bills itself as a free “digital library,” a repository of books, movies and music and over 500 billion archived webpages to create a living history of the internet, is another. But what we need are not just information services with a mission-driven agenda, but spaces where people can talk, share and relate without those relationships being distorted and shaped by profit-seeking incentive structures.
Users can post only once a day, every post is read by a moderating team, and if you’re too salty or run afoul of other norms, you’re encouraged to rewrite. This is terrible for short-term engagement — flame wars drive attention and use, after all — and as a business model, all those moderators are costly. But there’s a long-term payoff: two-thirds of Vermont households are on the Forum, and many Vermonters find it a valuable place for thoughtful public discussions.
In fact, public digital infrastructures might be the right place to start exploring how to reinvent governance and civil society more broadly.
If mission, design and governance are important ingredients, the final component is what might be called digital essential workers — professionals like librarians whose job is to manage, steward, and care for the people in these spaces. This care work is one of the pillars of successful physical communities, which has been abstracted away by the existing tech platforms.
The truth is that Facebook, Google and Twitter have displaced and sucked the revenue out of an entire ecosystem of local journalistic enterprises and other institutions that served some of these public functions.
·politico.com·