Hacking Human Minds
A very basic problem that hinders journalists, scientists, and medical professionals from successfully fighting false information is their failure to take into account well-established facts about human perception and psychology.
Anything that repeats a falsehood, even if it includes a rebuttal, amplifies and spreads the falsehood. It is not just the trending algorithms that are gamed by propaganda but our own minds. The researchers Soroush Vosoughi, Deb Roy, and Sinan Aral, in a widely cited study for the MIT Media Lab, investigated the spread of all verified true and false news stories on Twitter from 2006 to 2017. They analyzed 126,000 stories tweeted more than 4.5 million times, comparing false stories, true stories, and those containing a mixture of true and false content. They relied on six fact-checking organizations (snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com, and urbanlegends.about.com) to classify rumors as true, false, or partly true. They found that false stories travel further than true ones: a false story was 70 percent more likely to be retweeted than a true one, and it took an average of six times as long for true stories on Twitter to reach as many people as false ones. The false stories also travelled more widely.
What is it about falsehoods that makes them so viral? They seem to key into our emotions in some way that true stories don’t. The psychologist Gordon Pennycook presented combinations of false and true news stories to people and asked them to judge each one as either true or false. He found that people in a heightened emotional state were more likely to judge false stories as true. It wasn’t only feelings of anger that impaired people’s reasoning.
When people feel positive emotions, such as being inspired, strong, or proud, they are as likely to misjudge fakes as people feeling negative emotions such as fear, distress, or shame. The level of emotion, however, did not influence people’s ability to assess real news stories. This suggests that there is something in the content of fake news stories and rumors that connects with heightened emotions that is absent from factual stories. We often assume that we are more likely to believe a false rumor if it matches our political bias, but Pennycook found that heightened emotion had a much bigger effect. We are, however, more likely to share a story if it matches our political beliefs, even if we are able to determine that it is false.
False stories are more likely to trigger our instinct for novelty. Humans, and animals in general, are attuned to novelty because it often leads to opportunities and discovery. In the MIT study, the authors compared true and false rumors and found that the false ones contained more novel information. When they applied sentiment analysis to the replies, they found that sharers of false stories expressed greater surprise and disgust than sharers of true rumors.
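The kind of sentiment analysis described above can be illustrated with a minimal sketch: tag emotion-bearing words in reply text against an emotion lexicon and tally the categories. This is a toy version with an invented mini-lexicon; the MIT study used a full word–emotion lexicon over far larger reply corpora.

```python
from collections import Counter

# Toy word-to-emotion lexicon (illustrative only; real studies use
# large published lexicons mapping thousands of words to emotions).
EMOTION_LEXICON = {
    "wow": "surprise", "unbelievable": "surprise", "shocking": "surprise",
    "gross": "disgust", "disgusting": "disgust", "vile": "disgust",
    "expected": "anticipation", "knew": "anticipation",
    "sad": "sadness", "awful": "sadness",
}

def emotion_profile(replies):
    """Count emotion-category hits across a list of reply texts."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            word = word.strip(".,!?")
            if word in EMOTION_LEXICON:
                counts[EMOTION_LEXICON[word]] += 1
    return counts

# Hypothetical reply samples to a false rumor and a true story.
false_rumor_replies = ["Wow, unbelievable!", "This is disgusting.", "Shocking stuff"]
true_story_replies = ["Sad news.", "I knew this was coming", "Awful, but expected."]

print(emotion_profile(false_rumor_replies))  # surprise and disgust dominate
print(emotion_profile(true_story_replies))   # sadness and anticipation dominate
```

Comparing the two profiles across many rumor cascades is, in spirit, how the study concluded that false stories evoke more surprise and disgust than true ones.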
Watch: Renée DiResta: How to Beat Bad Information
Stanford University School of Engineering
Renée DiResta is research manager at the Stanford Internet Observatory, a multi-disciplinary center that focuses on abuses of information technology, particularly social media. She’s an expert in the role technology platforms and their “curatorial” algorithms play in the rise and spread of misinformation and disinformation.
In the series Social Networks
First Draft
The mission of First Draft is to protect communities from harmful misinformation. Through their CrossCheck program, they work with a global network of journalists to investigate and verify emerging news stories. The site offers many research articles, educational resources, and guidelines on misinformation and infodemics.
Data & Society
Data & Society studies the social implications of data-centric technologies & automation. It has a wealth of information and articles on social media and other important topics of the digital age.
Stanford Internet Observatory
The Stanford Internet Observatory is a cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media.
Sinan Aral is the David Austin Professor of Management, IT, Marketing and Data Science at MIT, Director of the MIT Initiative on the Digital Economy (IDE), and a founding partner at Manifest Capital. He has done extensive research on the social and economic impacts of the digital economy, artificial intelligence, machine learning, natural language processing, and social technologies such as digital social networks.
Renée DiResta is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies. She investigates the spread of malign narratives across social networks and assists policymakers in devising responses to the problem. She has studied influence operations and computational propaganda in the context of pseudoscience conspiracies, terrorist activity, and state-sponsored information warfare, and has advised Congress, the State Department, and other academic, civil society, and business organizations on the topic. At the behest of the Senate Select Committee on Intelligence (SSCI), she led one of the two research teams that produced comprehensive assessments of the Internet Research Agency’s and the GRU’s influence operations targeting the U.S. from 2014 to 2018.
The Internet’s Original Sin
Renée DiResta shows how the business models of internet companies led to platforms that were designed for propaganda.
“Computational Propaganda: If You Make It Trend, You Make It True”
The Yale Review
Zeynep Tufekci is an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, a contributing opinion writer at The New York Times, and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. Her first book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, provided a firsthand account of modern protest fueled by social movements on the internet.
She writes regularly for The New York Times and The New Yorker.