You Can’t Not Believe Everything You Read
“If you tell a lie big enough and keep repeating it, people will eventually come to believe it.” —Joseph Goebbels, Minister of Propaganda for the Third Reich
Marking fake news stories as “disputed” had no effect on the illusory truth effect.
Propagandists have long known that we are more likely to believe something simply as a result of repeated exposure to it. Repetition is one of the fundamental mechanisms of conditioning people and other organisms, often termed “learning.” The effect has been replicated in many experiments. But why would we give credence to something we know is false, rather than continue to reject it? One underlying assumption, stemming from the work of the philosopher René Descartes, is that we begin by understanding something—an idea or an assertion—and then decide whether to believe it based on whether the arguments or evidence are compelling.
The psychologist Daniel Gilbert takes us, instead, to another philosopher of that era who had a very different idea of how we comprehend and assess what we hear. Baruch Spinoza argued that in order to understand a statement, you had to first believe it—at least a little bit. Only then could you decide whether you actually believe it, based on your existing knowledge and logical consideration. In Gilbert’s summary, belief is first, easy, and inexorable, whereas doubt is retroactive, difficult, and only occasionally successful. The “illusory truth effect,” as it is called today, is widely supported by research. Gordon Pennycook found that when people were exposed to a fake news story, they were more likely to believe it when they saw it again (except for extremely implausible assertions, such as “The Earth is a perfect square”). The effect was present even when the fake news ran counter to people’s political beliefs. Marking fake news stories as “disputed” had no effect on the illusory truth effect.
Hacking Identity
Our national, ethnic, religious, and other identities can be a source of joy and celebration. But the propagandist’s task is to sow division and fear within society as a whole through these differences.
One of the oldest techniques used by propagandists is to manipulate people through their sense of shared identity. This is a two-pronged effort: one prong promotes pride in the identity itself, while the other raises threats to that identity from (supposed) outsiders. Our national, ethnic, religious, and other identities can be a source of joy and celebration. But the propagandist’s task is to sow division and fear within society as a whole through these differences.
The Breakdown of Trust
The external environment—what is happening in the world—can also make our minds susceptible to hacking. The violence and fragmentation of collective sensemaking that occurred in the past were never solely due to the arrival of a new communication technology. When the printing press was invented in China, around the seventh century CE, it was not followed by 30 years of religious violence as it was in Europe. When Luther printed his ninety-five theses in 1517, it was in response to long-brewing, widely held distrust and resentment over corruption in the Catholic Church. The pamphlets that he and other reformers printed and distributed spoke to what people were already seeing and feeling about the church. Similarly, targeted messages on social media connect with a growing sense of distrust and political polarization.
Corruption in the systems we once trusted has done great harm to society, but political polarization has been the most damaging factor.
The Knight Commission on Trust, Media and Democracy identified a set of global trends that have undermined the public’s trust in governments, big business, and media. There is a growing sense of inequity, especially in the developed world, coupled with the perception that government is unwilling or unable to address these problems. The 2008 financial meltdown was especially damaging, since the bailout of the big banks did not include any reform. Although global poverty has decreased enormously, the middle class, especially in the United States, has experienced declining economic mobility while the super-rich have gained unprecedented increases in wealth.
In his Medium article “How to Destroy Surveillance Capitalism,” Cory Doctorow invites us to consider the possibility that so many people are vulnerable to conspiracy theories because they are living through real conspiracies:
What if the trauma of living through real conspiracies all around us — conspiracies among wealthy people, their lobbyists, and lawmakers to bury inconvenient facts and evidence of wrongdoing (these conspiracies are commonly known as “corruption”) — is making people vulnerable to conspiracy theories?
Corruption in the systems we once trusted has done great harm to society, but political polarization has been the most damaging factor. It leads to government gridlock: laws and budgets do not get passed, or, when they do, it is over the opposition of half the country. That, in turn, deepens distrust of government. Polarized politics also fosters a sense of outrage, which makes us more likely to share stories that feed that outrage.
In the series: Social Networks
Further Reading
Websites:
Data & Society
Data & Society studies the social implications of data-centric technologies & automation. It has a wealth of information and articles on social media and other important topics of the digital age.
Stanford Internet Observatory
The Stanford Internet Observatory is a cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media.
Profiles:
Sinan Aral
Sinan Aral is the David Austin Professor of Management, IT, Marketing and Data Science at MIT, Director of the MIT Initiative on the Digital Economy (IDE), and a founding partner at Manifest Capital. He has done extensive research on the social and economic impacts of the digital economy, artificial intelligence, machine learning, natural language processing, and social technologies such as digital social networks.
Renée DiResta
Renée DiResta is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies. She investigates the spread of malign narratives across social networks and assists policymakers in devising responses to the problem. She has studied influence operations and computational propaganda in the context of pseudoscience conspiracies, terrorist activity, and state-sponsored information warfare, and has advised Congress, the State Department, and other academic, civil society, and business organizations on the topic. At the behest of the Senate Select Committee on Intelligence (SSCI), she led one of the two research teams that produced comprehensive assessments of the Internet Research Agency’s and GRU’s influence operations targeting the U.S. from 2014 to 2018.
YouTube talks:
The Internet’s Original Sin
Renée DiResta shows how the business models of internet companies led to platforms that were designed for propaganda.
Articles:
Computational Propaganda
“Computational Propaganda: If You Make It Trend, You Make It True”
The Yale Review
Claire Wardle
Dr. Claire Wardle is the co-founder and leader of First Draft, the world’s foremost non-profit focused on research and practice to address mis- and disinformation.
Zeynep Tufekci
Zeynep Tufekci is an associate professor in the School of Information and Library Science at the University of North Carolina, Chapel Hill, a contributing opinion writer at the New York Times, and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. Her first book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, provided a firsthand account of modern protest fueled by social movements on the internet.
She writes regularly for The New York Times and The New Yorker.
TED Talk:
Watch: We’re building a dystopia just to make people click on ads
External Stories and Videos
Watch: Renée DiResta: How to Beat Bad Information
Stanford University School of Engineering
Renée DiResta is research manager at the Stanford Internet Observatory, a multi-disciplinary center that focuses on abuses of information technology, particularly social media. She’s an expert in the role technology platforms and their “curatorial” algorithms play in the rise and spread of misinformation and disinformation.