irrationality

21 bookmarks
The data hinted at racism among white doctors. Then scholars looked again
Science that fits the zeitgeist sometimes does not fit the data
Although the authors of the original 2020 study had controlled for various factors, they had not included very low birth weight (ie, babies born weighing less than 1,500 grams, who account for about half of infant mortality). Once this was also taken into consideration, there was no measurable difference in outcomes.
james-bcn·archive.is·
Google Libros

The Art of Thinking Clearly by Rolf Dobelli

·goodreads.com·
Steven Pinker: Why Smart People Believe Stupid Things
Subscribe to The Free Press: https://thefp.pub/43SoejB Steven Pinker is a world-renowned cognitive psychologist, and is widely regarded as one of the most imp...
·youtu.be·
Why People Believe Weird Things | Request PDF
Request PDF | On Sep 1, 2002, Michael Shermer published Why People Believe Weird Things | Find, read and cite all the research you need on ResearchGate
·researchgate.net·
Opinion | How A.I. Chatbots Become Political
We may soon rely a lot on A.I. chatbots, so keeping an eye on their political leanings is becoming more and more important.
·nytimes.com·
Are Smart People Ruining Democracy? | Dan Kahan | TEDxVienna
Is political polarization over the reality of climate change, the efficacy of gun control, the safety of nuclear power, and other policy-relevant facts attributable to a simple deficit in public science literacy? Dan Kahan reviews study results showing that polarization on complex factual issues rises in lockstep with culturally diverse citizens' capacity to comprehend scientific evidence generally. The talk also reviews surprising evidence about how curiosity affects polarization. More information on http://www.tedxvienna.at

Dan Kahan is the Elizabeth K. Dollard Professor of Law & Professor of Psychology at Yale Law School. His primary research interests (for the moment, anyway) are risk perception, science communication, and the application of decision science to law and policymaking. He is a member of the Cultural Cognition Project, an interdisciplinary team of scholars who use empirical methods to examine the impact of group values on perceptions of risk and related facts. In studies funded by the National Science Foundation, his research has investigated public disagreement over climate change, public reactions to emerging technologies, and conflicting public impressions of scientific consensus. Current work of the Project is centered on integrating the methods of the science of science communication into the tool kits of professional communicators in diverse contexts ranging from local democratic decisionmaking to science-documentary filmmaking.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
·m.youtube.com·
These are the 3 biggest emerging risks the world is facing
Risks related to the misuse of #AI pose new threats to democracy, truth, and trust in institutions. “Misinformation and disinformation” is the biggest risk. #WEF24
·weforum.org·
Taiwan’s fight against digital disinformation
A Swiss democracy journalist writes about how the citizens of Taiwan – with government support – are successfully fighting back against fake news.
·swissinfo.ch·
The science of fake news
Addressing fake news requires a multidisciplinary effort
·science.org·
Misinformation is the symptom, not the disease
We are more conscious than ever of the harms of misinformation for the common good. This concern is driven by a widespread understanding of misinformation as a public disease in need of an urgent cure. Against this picture, philosopher Daniel Williams argues that misinformation is often a symptom of a deeper public malaise -- and that debunking and censorship won't be the magic bullet that we're hoping for.

Since the United Kingdom’s Brexit vote and the election of Donald Trump in 2016, we have been living through an unprecedented societal panic about misinformation. Poll after poll demonstrates that the general public is highly fearful of fake news and misleading content, a concern which is widely shared among academics, journalists, and policymakers.

This panic is driven by the narrative that misinformation is a kind of societal disease. Sometimes this metaphor is explicit, as with the World Health Organisation’s claim that we are living through an “infodemic” and influential research that likens misinformation to a contagious virus, but it also motivates the common diagnosis that misinformation lies at the root of many societal ills. In this analysis, ordinary individuals are routinely sucked into online rabbit holes that transform them into rabid conspiracy theorists, and misinformation is the driving force behind everything from vaccine scepticism to support for right-wing demagogues.

The disease model of misinformation has practical consequences. If misinformation is a societal disease, it should be possible to cure societies of various problems by eradicating it. The result is intense efforts among policymakers and companies to censor misinformation and reduce its visibility, as well as numerous initiatives that aim to cure citizens of false beliefs and reduce their “susceptibility” to them.

Is the disease narrative correct? In some cases, exposure to misinformation manifestly does have harmful consequences. Powerful individuals and interest groups often propagate false and misleading messages, and such efforts are sometimes partly successful. Moreover, evidence consistently shows that the highly biased reporting of influential partisan outlets such as Fox News has a real-world impact.

Nevertheless, the model of misinformation as a societal disease often gets things backwards. In many cases, false or misleading information is better viewed as a symptom of societal pathologies such as institutional distrust, political sectarianism, and anti-establishment worldviews. When that is true, censorship and other interventions designed to debunk or prebunk misinformation are unlikely to be very effective and might even exacerbate the problems they aim to address.

To begin with, the central intuition driving the modern misinformation panic is that people—specifically other people—are gullible and hence easily infected by bad ideas. This intuition is wrong. A large body of scientific research demonstrates that people possess sophisticated cognitive mechanisms of epistemic vigilance with which they evaluate information. If anything, these mechanisms make people pig-headed, not credulous, predisposing them to reject information at odds with their pre-existing beliefs. Undervaluing other people’s opinions, they cling to their own perspective on the world and often dismiss the claims advanced by others. Persuasion is therefore extremely difficult and even intense propaganda campaigns and advertising efforts routinely have minimal effects.

To many commentators, these findings are difficult to accept. If people are not gullible and persuasion is difficult, what explains the prevalence of extraordinary popular delusions and bizarre conspiracy theories? This question embodies a widespread but confused assumption, however: that the truth is always self-evident and desirable, such that false beliefs can only be explained by the credulous acceptance of misinformation.

First, the truth about complex and often distant states of affairs is not self-evident. In forming beliefs, citizens rely on interpretive dispositions and intuitions that are not well-aligned with truth or contemporary scientific consensus. Indeed, the very reason that we need science and expertise is precisely because the truth is often highly counter-intuitive. When it comes to topics as diverse as vaccines, nuclear power, GMOs, and the nature of complex, modern societies, most of us therefore start with pre-scientific intuitions. For example, many people’s intuitive sense of disgust is activated at the thought of being injected with (what they imagine to be) a live disease, and a deeply entrenched omission bias causes people to fear the consequences of being vaccinated (an act of commission) more than the consequences of not being vaccinated (an act of omission).

To overcome such intuitions, people must encounter and accept reliable information. Of course, defining what constitutes reliable information is as challenging as defining misinformation. Fallibility, bias, and error are ineliminable features of the human condition, including within our leading epistemic institutions. Nevertheless, precisely because modern science implements procedures designed to overcome human frailties and biases, such as peer review, open debate, and distinctive social norms, consensus views among diverse experts tend to be broadly reliable. Similarly, even their critics acknowledge that mainstream media outlets in democratic societies that adhere to norms of journalistic objectivity (e.g., fact-checking, balance, and accountability) tend to be mostly reliable when it comes to reporting on narrow matters of fact.

Unfortunately, not only do most citizens not pay much attention to politics or the news, but a minority actively distrust institutions such as modern science, public health authorities, and mainstream media. The causes of this distrust are complex and diverse. They include psychological traits that predispose some people towards paranoid worldviews; institutional failures, such as telling noble lies to manage public behaviour and dismissing legitimate ideas as conspiracy theories; and feelings—often justified—of exclusion from positions of power and influence. Whatever its causes, however, such distrust often drives people to seek out information—commonly misinformation—from counter-establishment sources and reject information from mainstream ones.

Second, the truth is not always desirable, nor easy to accept. When it comes to domains such as politics and culture, human beings are not disinterested truth seekers. The competing sides in political debates and culture wars often behave more like warring religious sects than groups organised around coherent worldviews. Their members embrace beliefs and narratives that signal their tribal allegiances, cast their group in a favourable light, and derogate their rivals and enemies. Similarly, just as those in power often seek to embrace worldviews that affirm and rationalise their superiority, members of the general public who despise “elites” and the “establishment” are often eager to embrace narratives that demonise them, sometimes in the most extreme way possible (e.g., by casting them as Satanic paedophiles).

These motivations to embrace biased beliefs cause people to seek out belief-justifying information. The result is a marketplace of rationalisations that rewards the production and dissemination of content that supports favoured narratives in society. We tend to view the super-spreaders of misinformation as master manipulators, orchestrating mass delusion from their keyboards and podcast appearances, but they are often better understood as entrepreneurs who use their rhetorical skills to affirm and justify in-demand beliefs in exchange for social and financial rewards. Beyond Merchants of Doubt, they are merchants of affirmation, and for the right price, they'll validate and rationalise anything.

The importance of factors such as institutional distrust, polarisation, and rationalisation markets implies a very different picture of misinformation, one in which it looks less like a disease than a mirror reflecting deeper societal pathologies. As the legal scholar Dan Kahan puts it, on this picture “misinformation is not something that happens to the mass public but rather something its members are complicit in producing.”

There is considerable evidence for this analysis. First, although some experimental evidence suggests that people can be persuaded to abandon false beliefs, such interventions rarely cause people to change more basic attitudes, such as voting or vaccination intentions, suggesting that consuming misinformation often serves to rationalise pre-existing inclinations rather than cause them.

Second, as with political media generally, misinformation largely preaches to the choir. For example, people consume and spread political misinformation that supports their favoured groups and causes, and members of online conspiratorial communities are not a cross-section of the population but people with specific motivations, identities, and predispositions. The consumers of misinformation are therefore rarely passive victims of false information; they actively seek out and engage with biased content and ...
·iai.tv·