We have created a world in which information flows more freely and efficiently than ever before. And yet we are finding it harder to understand one another and to agree on basic facts about reality. For some, this new world is an opportunity to deliberately manipulate the perceptions of others for their own benefit.
Social media applications were intended to foster personal expression: to publish a home video on YouTube, share what you are feeling on Facebook, post your artwork and crafts on Instagram, make an amusing or insightful comment on Twitter. In turn, the applications would enable you to find an audience through the power of the global network.
What is your mission?
The mission we serve as Twitter, Inc. is to give everyone the power to create and share ideas and information instantly without barriers.
Facebook’s mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.
YouTube’s mission is to give everyone a voice and show them the world. We believe that everyone deserves to have a voice, and that the world is a better place when we listen, share and build community through our stories.
The founders had grand visions for what their products might do, but those visions did not include soldiers of the caliphate advertising GoFundMe campaigns on Facebook and publishing beheadings on YouTube, or the platforms becoming vehicles for attacks on journalists, elections, and scientific expertise. As we shall see, the very design of the social media apps magnifies the negative behaviour and usage that undermines their stated missions.
Reality Distortion
In an interview for the PBS Frontline documentary The Facebook Dilemma, Renée DiResta, a researcher at the Stanford Internet Observatory and advisor to the US Congress and State Department on state-sponsored information warfare, recounts one of her first encounters with coordinated misinformation online. In 2014 she was wading through the San Francisco pre-school application process for her son when she happened to notice that the vaccination rate in the local schools was incredibly low, at 38–40 percent. When she called her local congressman, he told her that whenever anyone tried to introduce legislation to address it, the outcry from the anti-vaccine movement was so great that nothing got done. When a measles outbreak occurred in Disneyland a few months later, she called the congressman back and offered to help. He told her that as the bill to remove the vaccine opt-out made its way through the legislature, legislators would poll their constituents to see how they felt about it. The response from the public was 85 percent positive. But online, especially on Twitter, it was overwhelmingly negative.
Renée began working with the data scientist Gilad Lotan to map how the conversation on vaccines evolved over time. They discovered small groups of people coordinating their efforts across Twitter, Instagram, Pinterest, YouTube, and Facebook using a technique known as hashtag flooding. In Facebook groups and in nightly YouTube instructional videos, organizers told people what to tweet, whom to tweet at, and which hashtags to include. Group members created multiple Twitter accounts to push the same message, fooling Twitter’s trending algorithms into interpreting anti-vaccine sentiment as a groundswell of public opinion.
The good news is that the bill passed, partly because they were able to show that opposition to vaccination was not a majority opinion. But the episode showed that social media platforms and search engines can be fooled into presenting a highly distorted view of people’s actual opinions. These distortions have real-world effects. In 2019 the government of Samoa had to declare an emergency lockdown to deal with a measles outbreak of over 6,700 cases and 83 deaths, which followed a distributed social media campaign against vaccination originating in the United States.
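To make the dynamic concrete, here is a toy sketch in Python, using made-up numbers and a deliberately simplistic trend score rather than Twitter’s actual algorithms. It illustrates how a small coordinated group, posting in volume from many sock-puppet accounts, can dominate a metric that counts raw posts even when the great majority of real users hold the opposite view.

```python
from collections import Counter
import random

random.seed(0)

# 1,000 ordinary users, each posting once; roughly 85% support vaccination.
organic_posts = [
    ("pro" if random.random() < 0.85 else "anti", f"user_{i}")
    for i in range(1000)
]

# A coordinated group of 50 people, each running 10 sock-puppet accounts,
# each account posting the same anti-vaccine hashtag 20 times.
flood_posts = [
    ("anti", f"sock_{person}_{account}")
    for person in range(50)
    for account in range(10)
    for _ in range(20)
]

posts = organic_posts + flood_posts

# A naive trend score that counts raw post volume is dominated by the flood...
raw_volume = Counter(tag for tag, _ in posts)
# ...while counting each account only once still shows the real majority.
per_account = Counter(tag for tag, _ in set(posts))

print("Raw post volume:   ", dict(raw_volume))
print("Distinct accounts: ", dict(per_account))
```

Even the per-account count is somewhat inflated here, since each sock puppet still registers as a distinct account, which is why detecting this kind of manipulation requires looking at coordination patterns rather than volume alone.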
Information Warfare
In 2016 the Myanmar military began a persecution campaign against the Muslim Rohingya, a minority group within the predominantly Buddhist country. At least 24,000 people were killed and over 700,000 fled or were driven from their homes into refugee camps in neighboring Bangladesh. It all started on Facebook. As Paul Mozur reports in his New York Times article, as early as 2013 the military deployed hundreds of personnel to a social media campaign to stir up fear and hatred between the Buddhist and Muslim populations in Myanmar. Working in shifts, they set up Facebook pages devoted to pop stars, models, and other celebrities.
They created many other accounts impersonating ordinary citizens. Once these accounts acquired a massive following, they began to post fake photos and reports of atrocities committed by the Rohingya. These fake accounts worked by appearing to be authentic people voicing deeply held opinions and concerns. In one example, an attractive, well-spoken young woman talks about the honor of running in the Miss Myanmar beauty pageant and how it promotes peace. The video clip then cuts to faked images of destroyed villages and murdered children, atrocities supposedly committed by Muslims.
The Oxford Internet Institute’s global cyber troop inventory found formally organized media manipulation campaigns in 70 countries. A “cyber troop” may consist of a small number of people working for the government during elections, or it may have many hundreds of full-time employees working year-round. Primarily they work by spreading pro-government or pro-party propaganda across social media. Adopting the same strategy as protesters and social activists, they use hashtag flooding. More sophisticated and better-funded operations set up fake news sites and research institutes to give the appearance of unbiased blogs and articles written by journalists. While many of these campaigns are simply pro-government propaganda, others are designed to suppress votes, harass opposition journalists, and undermine democratic institutions.
The Axemaker’s Gift
Technology’s Capture and Control of Our Minds and Culture
James Burke and Robert Ornstein
Explore the double-edged history of human culture: how those with a capacity for sequential analysis generated technologies to “cut and control” the world and shape their communities.
The Attention Merchants
The Epic Scramble to Get Inside Our Heads
Tim Wu
Columbia law professor Tim Wu explores how, aided by the advent of mass marketing and increasingly sophisticated advertising techniques, our attention has become one of the hottest commodities on the planet and the common currency of propagandists, media executives, and internet moguls.
Further Reading
External Stories and Videos
Fake Facebook Pages Spurred Genocide in Myanmar
Kristin Houser, Future Society
For half a decade, the Myanmar military has used Facebook to fuel this fire against the Rohingya minority.
Facebook Will Now Show You Exactly How It Stalks You — Even When You’re Not Using Facebook
Geoffrey A. Fowler, Washington Post
The new ‘Off-Facebook Activity’ tool reminds us we’re living in a reality TV program where the cameras are always on. Here are the privacy settings to change right now.
‘Our Minds Can Be Hijacked’: The Tech Insiders Who Fear a Smartphone Dystopia
Paul Lewis, The Guardian
Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Meet the Silicon Valley refuseniks alarmed by a race for human attention.