Multiple studies show that old and new trends in social media "are making Americans more susceptible to the rumors and lies that undermine democracies and their militaries"
AI Warfare: How Foreign Powers Are Targeting the U.S. Election
An attempted assassination and the conspiratorial disinformation that followed
(Digital violence is another term whose use has been rising. It once referred mainly to online bullying, but it could now be applied to the rhetoric of conflict.)
Top 10 Generative AI Models Mimic Russian Disinformation Claims A Third of the Time, Citing Moscow-Created Fake Local News Sites as Authoritative Sources - NewsGuard
‘Convergence of Anger’ Drives Disinformation Around E.U. Elections
OpenAI Says Russia and China Used Its A.I. in Covert Campaigns
How China is using AI news anchors to deliver its propaganda
Special Paper: Current safeguards, risk mitigation, and transparency measures of large language models against the generation of health disinformation: repeated cross sectional analysis
Poe
Deepfake fears intensify as Google CEO sounds alarm on AI misinformation
Anthropic takes steps to prevent election misinformation | TechCrunch
Opinion | A.I. Is Coming for the Past, Too
Global Risks Report 2024 | World Economic Forum
Forget technology — politicians pose the gravest misinformation threat
This is set to be a big election year, including in India, Mexico, the US, and probably the UK. People will rightly be on their guard for misinformation, but much of the policy discussion on the topic ignores the most important source: members of the political elite.

As a social scientist working on political communication, I have spent years in these debates — which continue to be remarkably disconnected from what we know from research. Academic findings repeatedly underline the actual impact of politics, while policy documents focus persistently on the possible impact of new technologies.

Most recently, Britain’s National Cyber Security Centre (NCSC) has warned of how “AI-created hyper-realistic bots will make the spread of disinformation easier and the manipulation of media for use in deepfake campaigns will likely become more advanced”. This is similar to warnings from many other public authorities, which ignore the misinformation from the most senior levels of domestic politics. In the US, the Washington Post stopped counting after documenting at least 30,573 false or misleading claims made by Donald Trump as president. In the UK, the non-profit FullFact has reported that as many as 50 MPs — including two prime ministers, cabinet ministers and shadow cabinet ministers — failed to correct false, unevidenced or misleading claims in 2022 alone, despite repeated calls to do so.

These are actual problems of misinformation, and the phenomenon is not new. Both George W Bush and Barack Obama’s administrations obfuscated on Afghanistan. Bush’s government and that of his UK counterpart Tony Blair advanced false and misleading claims in the run-up to the Iraq war. Prominent politicians have, over the years, denied the reality of human-induced climate change, proposed quack remedies for Covid-19, and so much more. These are examples of misinformation, and, at their most egregious, of disinformation — defined as spreading false or misleading information for political advantage or profit.

This basic point is strikingly absent from many policy documents — the NCSC report, for example, has nothing to say about domestic politics. It is not alone. Take the US Surgeon General’s 2021 advisory on confronting health misinformation, which calls for a “whole-of-society” approach — and yet contains nothing on politicians and curiously omits the many misleading claims made by the sitting president during the pandemic, including touting hydroxychloroquine as a potential treatment.

This oversight is problematic because misinformation coming from the top is likely to have a far greater impact than that from most other sources, whether social media posts by ordinary people, hostile actors, or commercial scammers. People pay more attention to what prominent politicians say, and supporters of those politicians are more inclined to believe it and act on it.

We know this from years of research. Millions of Americans believed there was systematic voter fraud in the 2020 elections, that weapons of mass destruction were found in Iraq, that human activity played little role in climate change, and that the risks and side effects of Covid-19 vaccines outweighed the health benefits. What all these misleading beliefs have in common is that they have been systematically advanced by political actors — by the right in the US. But in, for example, Mexico, there is plenty of misinformation coming from the left.

Meanwhile, the policy discussion remains bogged down with how to police AI-generated content, while distracting us from how some politicians — perhaps conscious of how tech companies eventually blocked Trump in the dying days of his presidency — are pushing for legal exemptions from content moderation.

Of course there will be examples of AI-generated misinformation, bots, and deepfakes during various elections next year. But the key question is how politicians will be using these tools. A pro-Ron DeSantis political action committee has already used an AI version of Trump’s voice in a campaign ad. This is not some unnamed “malicious actor”, but a team working on behalf of the governor of a state with a population larger than all but five EU member states. We have seen examples of similar activity in elections in Argentina and New Zealand too.

When it comes to the most serious misinformation, the calls tend to come from inside the house. Technology will not change that, so let’s stop gaslighting the public and admit clearly as we head into a big election year that misinformation often comes from the top.
Worried About Deepfakes? Don’t Forget “Cheapfakes”
“Political ads are deliberately designed to shape your emotions and influence you. So, the culture of political ads is often to do things that stretch the dimensions of how someone said something, cut a quote that's placed out of context,” says Gregory. “That is essentially, in some ways, like a cheap fake or shallow fake.”
Eric Schmidt has a 6-point plan for fighting election misinformation
How generative AI is boosting the spread of disinformation and propaganda
Forecasting Potential Misuses of Language Models for Disinformation Campaigns—and How to Reduce Risk
Russia Reactivates Its Trolls and Bots Ahead of Tuesday’s Midterms
JFK Foresaw Donald Trump’s America
Head of FDA Says Misinformation Is Now the Leading Cause of Death
Free speech concerns mount over DHS 'disinformation' board as lawmakers, critics weigh in
Report Launch: The Future of the U.S.-U.K. Intelligence Alliance
Beware the Never-Ending Disinformation Emergency
U.S. Army Techniques Publication: Chinese Tactics
Biden Has to Play Hardball with Internet Platforms
Why Misinformation Is About Who You Trust, Not What You Think