Social Media, Fake News, and Violence in Developing Countries

“It’s now clear that so-called fake news can have real-world consequences…. This is not about politics or partisanship. Lives are at risk… lives of ordinary people just trying to go about their days, to do their jobs, contribute to their communities.” (Hillary Clinton, 8 December 2016)

Introduction

In the 20th century, traditional mass media such as newspapers, radio, and television were the dominant sources of information shaping collective knowledge and public opinion (for example, McQuail, 2010; Silverstone, 2003; Van Dijk, 2013). The internet has altered this information landscape by flattening the hierarchical structure of communication (Friedman, 2005) and handing power over information from the mass media to the people. The rise of social media in particular has further amplified anyone’s capacity to voice ideas through personal connections, so that any individual can become an information source as powerful as traditional media.

Even though the decentralization of information has empowered people, a new problem has emerged: misinformation and disinformation, now commonly called “fake news”. Misinformation and disinformation warp and distort reality, and in doing so create social conflict and even incite violence (Soll, 2016; Wendling, 2018). In Myanmar, Facebook fanned hatred against the Muslim Rohingya minority and contributed to ethnic cleansing (Specia & Mozur, 2017); an estimated 600,000 Rohingya fled Myanmar in 2017 to escape persecution (McLaughlin, 2018). In a similar case in Sri Lanka, fake news went viral on Facebook and provoked religious violence, resulting in rioting and lynching (Taub & Fisher, 2018).

This paper begins with a brief description of the fake news cases in Myanmar and Sri Lanka, then develops a theoretical framework for examining them. Analysis, discussion, conclusions, and suggestions follow in the latter part of the paper.

Case studies

Sri Lanka

It took only one Facebook post about a traffic accident to instigate mob violence and killing directed at the Muslim minority in Sri Lanka. Information about a minor traffic accident involving a member of the Tamil Muslim minority and the Sinhalese Buddhist majority was twisted into a story about an alleged Muslim plot to decimate the Buddhist majority, and it pushed Sri Lanka into rioting and ethnic killing (Taub & Fisher, 2018; Goel et al., 2018). The violence took its toll: dozens of minority-owned businesses and houses and at least one mosque were attacked, and at least one member of the minority was killed (Goel et al., 2018). The government blamed Facebook, WhatsApp, and Instagram for amplifying hate speech and for failing to stem its spread on their platforms, and those services were subsequently shut down in the country for a time (Taub & Fisher, 2018).

Myanmar

Without Facebook, 600,000 Rohingya might not have fled their land to escape ethnic cleansing. The campaign against them was fanned by an ultranationalist Buddhist monk who spread a series of fake reports filled with hateful claims about the Rohingya, depicting them as aggressive outsiders who wanted to take over the country (McLaughlin, 2018). This false information sparked violence that led to lynchings and, ultimately, ethnic cleansing. International human rights groups have sharply criticized Facebook for remaining passive, arguing that the company could have done far more to prevent the humanitarian crisis (Specia & Mozur, 2017). To put this in perspective: in Myanmar, Facebook cooperates with local ISPs to provide free internet access, so most people get their news from Facebook, making it an extraordinarily powerful medium for shaping public opinion. The company brushed aside the accusations and was reluctant to take responsibility or any measure that could curb the circulation of fake news in Myanmar (McLaughlin, 2018).

Theoretical Framework

Misinformation and disinformation containing hate speech are becoming ubiquitous across social media platforms, and few efforts have been made by either policymakers or social media firms to remedy the problem. Understanding the factors that influence the spread of fake news is therefore indispensable, and it can offer a framework for prompting policy initiatives.

The individual, as both message source and information recipient, is at the center of the problem, and comprehending human information processing requires elucidating the cognitive processes underlying fake news consumption and sharing. Two assumptions underlie these processes: information is overabundant, and humans are cognitive misers. The former acknowledges the superabundance of information an individual faces in daily life; the theory of selective exposure holds that an individual picks only content that suits their predispositions (Sears & Freedman, 1967). This bias makes a person susceptible to manipulation, whereas crosscutting information broadens one’s perspective and makes one an informed citizen (Hagen, 1997). The latter concerns cognitive processing, which is itself inherently limited, since information can be processed through either a rational (central) route or a heuristic (peripheral) route (Petty & Cacioppo, 1986). This dual-route account, the elaboration likelihood model, suggests that information arriving through an interpersonal tie is likely to be processed through the peripheral route, because the personal connection cues the information as trustworthy and believable; a close tie is typically treated as a credible source. Selective exposure and peripheral processing together make a person gullible to fake news, and social media makes matters worse by reinforcing existing predispositions through the person’s ego network.

Demographics correlate with information literacy (Leung, 2010) and news consumption (Guess et al., 2018). Factors such as age, income, level of education, and geographic location predict levels of literacy. People from the upper and upper-middle socioeconomic classes have more resources for getting news from diverse sources than their counterparts, and such crosscutting news consumption helps insulate them from fake news because information can be examined critically before being shared with others. Furthermore, identity politics is more salient among the lower-middle and lower classes (Bennett, 2008), and people from these classes are more prone to false information because they lack adequate resources to evaluate it.

The internet interacts with people and reshapes the socio-technical arrangements of society. Social media enables a novel means of distributing and consuming information in which interpersonal and mass communication are commingled and a person’s voice can reach a mass audience instantaneously. It bypasses gatekeeping processes, and social media is now a main source of news: 62% of American adults get news from social media, and 18% do so often (Gottfried & Shearer, 2016). Platform algorithms curate the newsfeed, personalizing information, and they are unlikely to offer impartial news sources; a person who reads news from liberal sources is less likely to encounter information from conservative outlets, and vice versa. This promotes an echo chamber effect (Sunstein, 2001) that confines individuals to a filter bubble (Pariser, 2011) in which information circulates only among people who share similar values and interests. Polarization widens, and genuine deliberation on social media becomes unlikely. Moreover, social media consumption has displaced offline social activities, and this displacement effect costs users strong social ties (Kraut et al., 1998): a person may have hundreds or even thousands of connections on social media yet engage with only a few in real life. Living in a bubble with few strong social connections makes the effects of fake news even stronger.
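To make the filter-bubble mechanism concrete, the toy sketch below (in Python) shows how a purely engagement-driven ranking can keep surfacing content that matches a user’s past behavior. It is a minimal illustration under assumed names (user_affinity, rank_feed, the topic labels); it is not a description of any platform’s actual ranking system.

# Toy illustration of engagement-driven feed ranking (hypothetical, simplified).
# It only shows how ranking by similarity to past engagement keeps
# reinforcing the same viewpoints; no real platform works this simply.
from collections import Counter

def user_affinity(user_history, post_topics):
    """Count how often the post's topics appear in the user's past engagements."""
    history_counts = Counter(user_history)  # topics the user engaged with before
    return sum(history_counts[t] for t in post_topics)

def rank_feed(user_history, candidate_posts):
    """Order candidate posts by affinity to past engagement (highest first)."""
    return sorted(candidate_posts,
                  key=lambda post: user_affinity(user_history, post["topics"]),
                  reverse=True)

if __name__ == "__main__":
    history = ["nationalist", "nationalist", "religion", "sports"]
    posts = [
        {"id": 1, "topics": ["nationalist", "religion"]},  # matches prior engagement
        {"id": 2, "topics": ["economy"]},                   # crosscutting content
        {"id": 3, "topics": ["nationalist"]},
    ]
    # Posts matching existing predispositions rise to the top, while the
    # crosscutting post sinks -- the filter-bubble effect in miniature.
    print([p["id"] for p in rank_feed(history, posts)])  # -> [1, 3, 2]

In this simplified model the crosscutting post never reaches the top of the feed, which is exactly the dynamic Sunstein (2001) and Pariser (2011) describe.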

Figure 1. A framework to understand the fake news issue

Freedom of information is at the heart of internet governance, and no regulation should be enacted that constrains the right to express ideas and to receive information. Although some countries, such as China and Singapore, employ draconian measures and build firewalls to protect their sovereignty, most countries maintain an open internet. The fake news problem pits freedom of speech against censorship, and a company like Facebook is very unlikely to censor information on its platform. Yet the case in Myanmar, or Russian meddling in the US election, could be a turning point in redefining freedom of information in the fake news era. There should be a boundary between the right to speak and speech that promotes deception and violence.

Analysis and Discussion

Individual cognitive processes, socio-technical arrangements, demographics, and internet governance are the four main factors associated with fake news (see Fig. 1). The factors are interdependent and affect one another; for example, the individual factor is closely related to demographics, since the latter predicts an individual’s news consumption (Guess et al., 2018).

In Myanmar and Sri Lanka, the people exposed to fake news come largely from the middle and lower socioeconomic classes and live in rural areas where Facebook is the main source of information (Taub & Fisher, 2018; Specia & Mozur, 2017). The effect of fake news is salient among them because it diffuses through personal connections (i.e., Facebook and WhatsApp), so the information is considered trustworthy. Social media algorithms escalate the problem by promoting homogeneous information that magnifies the filter bubble. Other media are available outside social media, but people do not use them to obtain crosscutting information. To make matters worse, a flagging system is in place but ineffective, because Facebook does not devote enough human resources to filtering and taking down hate speech on the platform (McLaughlin, 2018). Facebook is reluctant to take down fake news because no internet policy or regulation requires the company to act against false information; even in the case of Russian meddling in the US election, Congress could not punish Facebook for its failure to filter fake news.

All of these factors are interconnected and simultaneously shape the dissemination and consumption of fake news in Myanmar and Sri Lanka. Local governments have less power than social media firms to tackle the spread of fake news; shutting down local internet services may be an option, but it amounts to killing a fly with a sledgehammer. Facebook has more financial and technological resources than governments and civil society, and therefore bears greater responsibility for actively filtering false information on its platform. Improving the flagging system and allocating adequate human resources to scrutinize all flag reports are the most practical immediate steps to slow the spread of fake news. In the longer term, the company should improve algorithms that detect fake news faster and more efficiently. Facebook earns enormous profits from its users, and it has a social responsibility to repair the damage caused by fake news.
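Because the recommendation above hinges on reviewing flag reports with a limited pool of human moderators, the short Python sketch below illustrates one plausible triage rule: review the flagged posts with the largest potential harm (flags weighted by reach) first. The field names, weights, and threshold logic are assumptions made for illustration; this is not Facebook’s actual review process.

# Hypothetical triage of user flag reports for human review (illustrative only).
# Idea: with limited moderators, surface the posts whose flags and reach
# suggest the greatest potential for harm first.
from dataclasses import dataclass
from typing import List

@dataclass
class FlaggedPost:
    post_id: str
    flag_count: int      # how many users reported the post
    reach: int           # how many users have already seen it
    contains_slur: bool  # hit on a (hypothetical) hate-speech keyword list

def priority(report: FlaggedPost) -> float:
    """Simple harm score: flags weighted by reach, boosted if slurs are detected."""
    score = report.flag_count * report.reach
    if report.contains_slur:
        score *= 2  # assumed boost; a real system would be far more nuanced
    return score

def review_queue(reports: List[FlaggedPost]) -> List[FlaggedPost]:
    """Order flagged posts so moderators see the highest-priority ones first."""
    return sorted(reports, key=priority, reverse=True)

if __name__ == "__main__":
    queue = review_queue([
        FlaggedPost("a", flag_count=3, reach=50000, contains_slur=True),
        FlaggedPost("b", flag_count=40, reach=1000, contains_slur=False),
        FlaggedPost("c", flag_count=1, reach=200, contains_slur=False),
    ])
    print([p.post_id for p in queue])  # -> ['a', 'b', 'c']

Even a crude prioritization like this makes the paper’s point concrete: the bottleneck is not detection alone but the human capacity allocated to act on what users have already flagged.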

Conclusion and Suggestion

The problem is complex, encompassing the micro, mezzo, and macro levels, and the solution requires a multi-actor approach involving individuals, society, internet firms, and governments. Improving socio-economic conditions to raise individual information literacy will not suffice; new arrangements at the mezzo and macro levels should also be promoted, such as social media algorithms that surface crosscutting information and internet governance that better protects people. All actors should work together to find common ground for overcoming the fake news problem.

Freedom of information is at the heart of the internet, and fake news is a byproduct of this freedom. Global internet governance is needed as more people in developing countries adopt social media, and the fake news problem may worsen as more individuals depend on social media for news and general information. Local governments are unlikely to have the power and resources to regulate social media behemoths, so international agencies such as the UN or the ITU may need to intervene to protect people in emerging economies from being exploited as a mere market. Moreover, a country with a strong government may have the power to control the flow of fake news, yet that power can be misused to suppress dissenting opinions and opposition to the ruler: a 17-year-old Singaporean blogger was accused by the government of spreading religious hatred, while his blog openly criticized the government (The Guardian, 2017).

An independent third party should be assigned to monitor and filter the distribution of fake news in social media newsfeeds, composed of experts from diverse fields so that it serves the best interests of society. Sri Lanka and Myanmar have warned us about the danger of fake news, and new measures should be enacted to mitigate further conflict and violence caused by fake news on social media platforms.

References:

Bennett, W. L. (2008). “Changing citizenship in the digital age.” In W. Lance Bennett (ed.), Civic Life Online: Learning How Digital Media Can Engage Youth (pp.1-24). The John D. and Catherine T. MacArthur Foundation Series on Digital Media and Learning. Cambridge: The MIT Press.

Friedman, T. L. (2005). The World is Flat: A Brief History of the Twenty-First Century. Macmillan.

Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 US presidential campaign. European Research Council.

Goel, V., Kumar, H., & Frenkel, S. (2018, March 8). In Sri Lanka, Facebook contends with shutdown after mob violence. The New York Times. Retrieved from: https://www.nytimes.com/2018/03/08/technology/sri-lanka-facebook-shutdown.html

Gottfried, J. & Shearer, E. (2016). News use across social media platforms 2016. Pew Research Center. Retrieved from: http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/

Hagen, I. (1997). Communicating to an ideal audience: news and the notion of the informed citizen. Political Communication, 14(4), 405-419.

Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukophadhyay, T., & Scherlis, W. (1998). Internet paradox: A social technology that reduces social involvement and psychological well-being? American Psychologist, 53(9), 1017-1031.

Leung, L. (2010). Effects of Internet connectedness and information literacy on quality of life. Social Indicators Research, 98(2), 273-290.

McQuail, D. (2010). McQuail’s Mass Communication Theory. Sage Publications.

McLaughlin, M. (2018, July 6). How Facebook’s rise fueled chaos and confusion in Myanmar. Wired. Retrieved from: https://www.wired.com/story/how-facebooks-rise-fueled-chaos-and-confusion-in-myanmar/

Pariser, E. (2011). The Filter Bubble: How the New Personalized Web Is Changing What We Read And How We Think. Penguin.

Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In Communication and persuasion (pp. 1-24). Springer, New York, NY.

Silverstone, R. (2003). Television and Everyday Life. Routledge.

Singapore teen blogger who criticised government wins asylum in US. (2017, March 24). The Guardian. Retrieved from: https://www.theguardian.com/us-news/2017/mar/25/singapore-teen-blogger-who-criticised-government-wins-asylum-in-us

Sears, D. O., & Freedman, J. L. (1967). Selective exposure to information: A critical review. Public Opinion Quarterly, 31(2), 194-213.

Specia, M., & Mozur, P. (2017, October 27). A war of words puts Facebook at the center of Myanmar’s Rohingya crisis. The New York Times. Retrieved from: https://www.nytimes.com/2017/10/27/world/asia/myanmar-government-facebook-rohingya.html

Soll, J. (2016, December 18). The long and brutal history of fake news. POLITICO Magazine. Retrieved from: https://www.politico.com/magazine/story/2016/12/fake-news-history-long-violent-214535

Sunstein, C. R. (2001). Republic.com. Princeton University Press.

Taub, A. & Fisher, M. (2018, April 21). Where countries are tinderboxes and Facebook is a match. The New York Times. Retrieved from: https://www.nytimes.com/2018/04/21/world/asia/facebook-sri-lanka-riots.html

Van Dijk, T. A. (2013). News as Discourse. Routledge.

Wendling, M. (2018, January 22). The (almost) complete history of ‘fake news.’ BBC News. Retrieved from: https://www.bbc.com/news/blogs-trending-42724320

 
