By Shawn Fuller
We are in the midst of the greatest revolution in communication technology since humanity spread across the globe, exceeding in scope and impact the printing press. A change of this magnitude to how we communicate with each other requires us to adapt both as individuals and as nations. The experts in the fields of media, journalism, and communications have been warning us for years that social media is using us more than we are using it.
The coronavirus pandemic touched the lives of everyone on the globe. Within a few months, millions of people had lost loved ones or their livelihoods, or were worked to the breaking point on the frontlines of health and human services. It changed our day-to-day activities: how we worked, how we shopped, and how we connected with family and friends. With lockdowns in almost every country, the major streets and thoroughfares emptied of traffic and an unfamiliar silence descended upon our cities. Meanwhile, the digital transformation already reshaping our society went into overdrive. We spent even more time hunched in front of computer screens or staring at devices as we worked from home through Google Meet, Zoom, Slack, and digital whiteboards. We flocked to streaming platforms such as Netflix, YouTube, and Amazon Prime to escape the boredom of lockdown and the stress of everyday life. In place of socializing over coffee, lunch, or dinner, we visited and played games over Zoom and Skype.
With these networked tools at our disposal, we responded to the pandemic with a speed and global reach that would have been impossible only a few years earlier. The growing trend towards open data and open science allowed scientists to share their findings across borders, even between opposing countries. The World Health Organization (WHO) and national medical authorities communicated to the public more rapidly through Facebook, YouTube, Twitter, and TikTok than they ever could through government and traditional media.
The data scientist Sinan Aral and his team partnered with Facebook to model the spread of COVID-19 and better inform national and international health organizations. Using anonymized and aggregated data from Facebook’s disease prevention maps, they compared population movements within cities and regions across the world against their respective social distancing orders, revealing which orders were most effective. The work clearly demonstrated how the combined growth of public networks, big data, and ubiquitous computing can be used to address major threats to public health.
The pandemic also made very clear the growing fault lines in our society. The WHO had been preparing for a worldwide pandemic for years, but it was not prepared for the infodemic, the epidemic spread of misinformation, that accompanied COVID-19. The Stanford Internet Observatory’s Virality Project tracked conspiracy theories and misinformation promoted on Facebook and Twitter. It found that misinformation was most often true information that had been twisted, spun, or taken out of context, rather than completely fabricated. False or misleading claims most often centered on the actions or policies of authorities, including governments and international bodies like the UN and WHO. It became very clear that while the digital platforms have amplified how much and how quickly we can communicate, they also amplify our growing loss of trust in the institutions and governments that are necessary for coordinated and coherent responses to global problems. To their credit, Facebook, YouTube, and Twitter worked hard to promote information from reliable medical sources and to label or remove content promoting false cures and conspiracy theories. But even this was met with distrust and claims of censorship.
Kate Starbird and her team at the University of Washington study how people use social media during crisis events. Rumouring and information sharing are basic human responses to crisis and uncertainty, and the social media platforms have enabled us to do both at unprecedented scale and speed. They have also made it much easier for groups to deliberately manipulate us by exploiting the rumouring process itself.
Kate Starbird, Emma S. Spiro, and Kolina Koltai. Misinformation, Crisis, and Public Health.
Renée DiResta is a researcher at the Stanford Internet Observatory and has advised the US Congress and State Department on state-sponsored information warfare. In an interview on the PBS Frontline documentary The Facebook Dilemma, she recounts one of her first encounters with coordinated misinformation online. In 2014, while wading through the San Francisco pre-school application process for her son, she happened to notice that the vaccination rate in the local schools was incredibly low, at 38–40 percent. When she called her local congressman, he told her that whenever anyone tried to introduce legislation to change it, the anti-vaccine movement raised such an outcry that nothing got done. When a measles outbreak occurred in Disneyland a few months later, she called the congressman back and offered to help. He told her that as the bill to remove the vaccine opt-out made its way through the legislature, legislators polled their constituents to see how they would vote on such a bill. The response from the public was 85 percent positive. But online, especially on Twitter, it was overwhelmingly negative.
Renée began working with the data scientist Gilad Lotan to map how the conversation on vaccines evolved over time. They discovered small groups of people coordinating their efforts across Twitter, Instagram, Pinterest, YouTube, and Facebook, using a technique known as hashtag flooding. In Facebook groups and nightly YouTube instructional videos they told people what to tweet, who to tweet at, and which hashtags to include. Group members created multiple Twitter accounts to push the same message. This fooled Twitter’s trending algorithms into interpreting anti-vaccination messaging as a groundswell of public opinion.
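A toy simulation makes the mechanics concrete. The Python sketch below is an illustrative assumption, not a description of Twitter’s actual systems: the account counts and the hashtag are invented, and real trending algorithms are far more sophisticated than a raw mention count. It simply shows how a small coordinated group running multiple accounts can outweigh a much larger but unorganized public in a naive volume-based ranking.

```python
from collections import Counter
import random

random.seed(0)  # deterministic for the example

ORGANIC_USERS = 10_000      # each tweets one of many hashtags at random
COORDINATED_PEOPLE = 50     # each runs several accounts pushing one tag
ACCOUNTS_PER_PERSON = 10

# 10,000 organic tweets spread across 200 hashtags: ~50 mentions per tag
popular_tags = [f"#topic{i}" for i in range(200)]
tweets = Counter()
for _ in range(ORGANIC_USERS):
    tweets[random.choice(popular_tags)] += 1

# Hashtag flooding: 50 people x 10 accounts = 500 mentions of a single tag
tweets["#noVaccineMandate"] += COORDINATED_PEOPLE * ACCOUNTS_PER_PERSON

# The flooded tag dominates a naive "count the mentions" trending list
print(tweets.most_common(3))
```

Five hundred people’s worth of apparent enthusiasm, produced by fifty, comfortably tops every organically popular tag; the distortion comes from multiplying accounts, not from persuading anyone.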
The good news is that the bill passed, partly because DiResta and Lotan were able to show that anti-vaccination was not a majority opinion. But the episode showed that social media and search engines can be fooled into presenting a highly distorted view of people’s actual opinions. These distortions have real-world effects. In 2019 the government of Samoa had to call an emergency lockdown to deal with a measles outbreak of over 6,700 cases and 83 deaths, fueled by a distributed anti-vaccination social media campaign originating in the United States.
In 2016 the Myanmar military began a persecution campaign against the Rohingya, a Muslim minority in the predominantly Buddhist country. At least 24,000 people were killed and over 700,000 fled or were driven from their homes into refugee camps in neighboring Bangladesh. It all started on Facebook. As Paul Mozur reports in The New York Times, as early as 2013 the military had deployed hundreds of personnel to a social media campaign to stir up fear and hatred between Myanmar’s Buddhist and Muslim populations. Working in shifts, they set up Facebook pages devoted to pop stars, models, and other celebrities.
They created many other accounts impersonating ordinary citizens. Once these accounts had acquired a massive following, they began to post faked photos and reports of atrocities committed by the Rohingya. These fake accounts work by posing as authentic people voicing deeply held opinions and concerns. In one example, an attractive, well-spoken young woman talks about the honor of competing in the Miss Myanmar beauty pageant and how it promotes peace. The video clip then cuts to faked images of destroyed villages and murdered children, supposedly the work of Muslims.
The Oxford Internet Institute’s global cyber troop inventory found formally organized media manipulation campaigns in 70 countries. A “cyber troop” may consist of a small number of people working for the government during elections, or of many hundreds of full-time employees working year-round. Primarily they work by spreading pro-government or pro-party propaganda across social media, adopting the same hashtag-flooding strategy as protesters and social activists. More sophisticated and better-funded programs set up fake news sites and research institutes to give the appearance of unbiased blogs and articles by journalists. While many of these campaigns simply spread pro-government propaganda, others are designed to suppress votes, harass opposition journalists, and undermine democratic institutions.
We have created a world in which information flows more freely and efficiently than ever before and yet we are finding it harder to understand one another and to agree on basic facts about reality. It is natural for people to disagree on important issues, but social media magnifies the political, cultural, and ethnic divisions within our societies, almost as if they were designed to do so.
The social media applications were intended to foster personal expression: to publish a home video on YouTube, share what you are feeling on Facebook, post your artwork and crafts on Instagram, make an amusing or insightful comment on Twitter. In turn, the applications would enable you to find an audience through the power of the global network.
What is your mission?
The mission we serve as Twitter, Inc. is to give everyone the power to create and share ideas and information instantly without barriers.
Facebook’s mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.
YouTube’s mission is to give everyone a voice and show them the world. We believe that everyone deserves to have a voice, and that the world is a better place when we listen, share and build community through our stories.
The founders had grand visions for what their products might do, but those visions did not include soldiers of the caliphate advertising GoFundMe campaigns on Facebook and publishing beheadings on YouTube, nor their platforms becoming vehicles for attacks on journalists, elections, and scientific expertise. Yet the very design of the social media apps magnifies the negative behaviour that undermines their stated missions.
The Axemaker’s Gift
In their book, The Axemaker’s Gift, James Burke and Robert Ornstein show us that every tool or technology we have invented, from the hand axe chipped out of stone to the printing press, took on a life of its own. Each gift changed humanity in ways that were neither suspected nor predicted by the axemakers and their delighted recipients. Most of the tools gave us increasing control over our environment: we could hunt and fish more efficiently, and cut, burn, and plow vast tracts of land for our own use. We tend to view the invention of numbers and writing as expanding the abilities of the human mind, but early civilizations used them to monitor and control their populaces. Occasionally, however, the authorities would lose control of a new technology.
The printing press began as a money-making scheme, which Gutenberg needed to salvage a serious investment blunder. It grew rapidly in popularity as both the church and city governors adopted it. Had printing been centralized, say in Rome, or available only to governments and the church, it would have remained only a broadcasting technology, similar to today’s newspapers and books. But printers set up shop in many towns and were willing to sell their services to anyone who would pay. This made it a networked communication technology as well, similar to desktop publishing and the internet. Without it, Martin Luther’s fate would likely have been the same as that of Jan Hus, an earlier church reformer, who was burned at the stake. Instead, as Niall Ferguson points out in The Square and the Tower, by the time the pope had heard about Luther and tried to excommunicate him, Luther had already published his 95 theses in 30 different pamphlets, selling 300,000 copies.
The Reformation was more than just Protestants against Catholics. Once the authority of the Church as the single source of truth had been challenged, it did not simply result in two alternative faiths. Part of the bloodshed that ensued over the next 30 years resulted from Protestants splitting into factions, each differing on points of theology.
The eventual solution to the problem was two-fold. One part was a series of treaties among the recently formed European powers, collectively called the Peace of Westphalia: a new set of hierarchies (the European states) exercised control over networked dissension and dispute. The other was the imposition of standardized, graded, and officially recognized systems of education in both Protestant and Catholic countries, intended to foster trusted, society-wide means of identifying truth.
In just over a decade social media has disrupted the major institutions of our society. A command of social media is now essential to any government that wishes to stay in power. As P.W. Singer and Emerson Brooking show in their book, LikeWar: The Weaponization of Social Media, social media has become the new battleground, one in which the goal is to shape and distort our perceptions of reality. While there is nothing new about shaping perception for war or profit, the extent to which social media is now an integral part of our information ecosystem changes everything. Newspapers and magazines have lost much of their ad revenue, and local newspapers have largely gone out of business. According to a 2018 report by the Pew Research Center, about two-thirds of American adults at least occasionally get their news from social media, and one in five often do. In fact, traditional news organizations depend on social media and Google to bring people to their sites: social media is another channel through which the articles, images, and video generated by traditional news reach the public.
“If you are online, your attention is like a piece of contested territory, being fought over in conflicts that you may or may not realize are unfolding around you. Everything you watch, like, or share represents a tiny ripple on the information battlefield, privileging one side at the expense of others. Your online attention and actions are thus both targets and ammunition in an unending series of skirmishes. Whether you have an interest in the conflicts of LikeWar or not, they have an interest in you.”
P.W. Singer & Emerson Brooking. LikeWar: The Weaponization of Social Media. Houghton Mifflin Harcourt.
We are in a strange situation where the majority of people in the world use social media every day, yet we understand little of its nature and influence on us, other than that it is very addictive. If we are to benefit from digital technology as we eventually did from the alphabet and the printing press, we need to understand better how it works and how it plays upon our own mental systems. Every technology requires a set of norms and safety practices around its use. Electrical fires and electrocutions are rare because of the combined effect of public education, laws, industry practices, and trusted experts. We will need to develop comparable practices for social media.
Kristin Houser, Future Society
For half a decade, the Myanmar military has used Facebook to fuel hatred against the Rohingya minority.
The Science of Fake News
The amount of false news online is clearly increasing, with serious consequences: the misallocation of resources during terror attacks and natural disasters, misaligned business investments, and misinformed elections. Unfortunately, scientific understanding of how and why false news spreads is currently based on ad hoc rather than large-scale, systematic analysis.
In one study, a research team at MIT analyzed all the fact-checked rumors, true and false, that spread on Twitter from 2006 to 2017: some 126,000 stories tweeted by about 3 million people more than 4.5 million times. The study found that false news spreads far more extensively than the truth, with false stories some 70% more likely to be retweeted.
The study also overturns conventional wisdom about how false news spreads. Though recent congressional testimony has focused on the role of automated bots, this study concludes that human behavior contributes more than bots to the differential spread of falsity and truth on Twitter.
One explanation may be that false rumors are measurably more novel than true ones, though whether users actually perceived them as novel is difficult to establish. In an effort to assess users’ perceptions, the study compared the emotional content of replies to 32,000 Twitter hashtags, using a list of about 140,000 English words and their associations with eight emotions: anger, fear, anticipation, trust, surprise, sadness, joy, and disgust. They found that false rumors inspired replies expressing greater surprise, corroborating the novelty hypothesis, and greater disgust. The truth inspired replies expressing greater sadness, anticipation, and trust.
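The lexicon-based scoring used in the study can be sketched in a few lines of Python. The tiny word list below is a stand-in assumption for illustration only; the actual study used roughly 140,000 English words and their emotion associations, and real replies would need proper tokenization rather than simple splitting.

```python
from collections import Counter

# Toy emotion lexicon: each entry maps a word to one of the eight
# emotions used in the study. These particular words are illustrative
# assumptions, not drawn from the study's actual word list.
EMOTION_LEXICON = {
    "shocking": "surprise", "unbelievable": "surprise",
    "gross": "disgust", "vile": "disgust",
    "sad": "sadness", "tragic": "sadness",
    "hope": "anticipation", "reliable": "trust",
}

def emotion_profile(replies):
    """Count emotion-word occurrences across a set of reply texts."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            word = word.strip(".,!?")          # crude punctuation cleanup
            if word in EMOTION_LEXICON:
                counts[EMOTION_LEXICON[word]] += 1
    return counts

# Hypothetical replies: one set reacting to a false rumor, one to a true story
false_rumor_replies = ["Shocking! This is unbelievable.", "Gross, just vile."]
true_story_replies = ["So sad and tragic.", "A reliable report, I hope."]

print(emotion_profile(false_rumor_replies))  # dominated by surprise, disgust
print(emotion_profile(true_story_replies))   # sadness, trust, anticipation
```

Aggregated over millions of replies, profiles like these are what let the researchers compare the emotional signature of reactions to false versus true stories.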
These findings may illuminate factors beyond novelty that inspire people to share false news, but more research is needed. For example, people may share news because it is surprising rather than because it is believed, which is an important difference. In any case, the study suggests that, as well as curtailing bots, containment policies should emphasize behavioural interventions such as labeling and incentives to dissuade the spread of misinformation.
Based on “The science of fake news” and “The spread of true and false news online,” from the March 9, 2018 issue of Science.
Geoffrey A. Fowler, Washington Post
The new ‘Off-Facebook Activity’ tool reminds us we’re living in a reality TV program where the cameras are always on. Here are the privacy settings to change right now.
Paul Lewis, The Guardian
Google, Twitter and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Meet the Silicon Valley refuseniks alarmed by a race for human attention.
The Epic Scramble to Get Inside Our Heads
Columbia law professor Tim Wu explores how our attention, aided by the advent of mass marketing and increasingly sophisticated advertising techniques, has become one of the hottest commodities on the planet and the common currency of propagandists, media executives, and internet moguls.
James Burke and Robert Ornstein
Explore the double-edged history of human culture: how those with a capacity for sequential analysis generated technologies to “cut and control” the world and shape their communities.