Why You Hate Your Neighbor
With rising polarization, congressional gridlock, and growing mistrust of political institutions, division has become a fact of life in American politics. Although numerous social and economic factors encourage division, the recent rise in political polarization can be attributed, at least in part, to one development: social media. Social media and the internet at large supply a never-ending stream of entertainment and information, yet their constant distractions foster an environment of intuition and impulsivity. Backed by algorithms that promote agreeable content, echo chambers, and the continuous spread of misinformation, we are ushered into a world of superficial delight that poses profound threats to our democratic institutions.
Information and Cognitive Overload
Sites like Twitter and Instagram supply users with mind-boggling streams of peer-generated content, leaving us better connected and more informed. This never-ending stream of information is not wholly beneficial, however. Studies of web pages—pages with text, hyperlinks (clickable links that lead to further background on the main text), videos, and graphics—show that these features actually decrease reading comprehension. As Nicholas Carr, a longtime journalist turned author, writes in his book The Shallows, “Evaluating links and navigating a path through them… involves mentally demanding problem-solving tasks that are extraneous to the act of reading itself.”1 The extra stimuli within a web page require extra work to process, making its content harder to understand than the same content in plain print. In fact, a recent meta-analysis reviewing almost two decades of studies on digital reading concludes that evidence for the inferiority of digital learning is “robust” and that digital environments, in general, are not always “best suited to [foster] deep comprehension and learning.” The point is not that accessing a text online makes it harder to understand; rather, the extraneous information surrounding it on a web page impairs comprehension. Since social media sites are simply web pages with more widgets, data, and distractions, the decline in reading comprehension attributed to web pages applies to them as well: the clutter of information on such sites makes reading more difficult and therefore reduces comprehension.
Social media sites make reading more difficult by overloading the sections of the brain dedicated to thoughtful analysis. In his book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman calls the part of the brain committed to logical analysis System 2. System 2, he says, is associated with “the subjective experience of agency, choice, and concentration.” It works to solve difficult problems and control “the thoughts and actions ‘suggested’ by System 1.” System 1 is System 2’s impulsive counterpart: it “[operates] automatically… with little or no effort and no sense of voluntary control,” and it is responsible for our intuitions, our emotions, and all of our “automatic” decisions. Together, the two systems let us analyze complex situations while still making snap decisions. They are, however, an imperfect pair. System 2 can focus on only a few topics at a time, and it draws on a “shared pool of mental energy” which is expended through “all variants of voluntary effort–cognitive, emotional, or physical.” Social media use rapidly drains this mental energy. As we scroll through Instagram or Twitter, our brains are actively deciding between liking a post, disliking it, commenting, or sharing it—in addition to digesting the content we’re viewing. We constantly analyze multiple forms of information—visual, audio, and social—so that System 2 is forced to continuously switch topics and methods of analysis. This rapid switching imposes what psychologists call a “switching cost,” which is particularly taxing on mental resources.
This sort of multitasking is what cognitive neuroscientist Jordan Grafman claims makes people “less deliberative [and] less able to think and reason out a problem.” Processing extraneous information on a social media site quickly tires System 2, depleting the mental energy normally reserved for processing, analyzing, and recognizing the nuance in text, political arguments, and complex situations. Social media sites, therefore, reduce comprehension and complexity of thought by wasting the fuel required for logical thought.
The depletion of mental energy is the key to understanding the effects of social media on political polarization. As System 2 tires, it offloads tasks to its neighbor, System 1. We become more impulsive, selfish, and likely to rely on stereotypes. Furthermore, the distractions inherent to social media sites also reduce the ability “to experience… human forms of empathy, compassion, and other kinds of emotions.” As we use social media sites, we become more bigoted and less empathetic. Social media sites, therefore, encourage prejudice and superficiality—not through deliberate policy, but through the unintended consequences of their design. Considering that half of US adults get their news from social media, a large portion of the voting population interacts with politics in an environment where they are primed to rely on intuitive prejudices and stereotypes. Some researchers have even found that Americans lean on political prejudice to such an extent that they rate members of the opposing political party as 20–30% less evolved than the average human—a tendency only amplified by the cognitively demanding environment of social media. The dehumanization inherent in these prejudices predicts metadehumanization, “the perception that another group dehumanizes your own group.”13 Metadehumanization, in turn, reduces “Americans’ support for democratic norms.”13 By priming us to give in to instinctual prejudice, social media deepens the already hostile climate of American politics and worsens our ability to find common ground—a dangerous erosion of our appreciation of democracy.
Trust and Misinformation
Bigotry and dehumanization result in the breakdown of trust in the opposite party, but mistrust can also arise through fairly rational, non-prejudiced means. As mathematician James Weatherall and behavioral scientist Cailin O’Connor argue in their book The Misinformation Age, mistrust often results from simple differences in opinion. If you trust your own views and believe you have evaluated the evidence well, then it follows that those who agree with you have also evaluated the evidence well and therefore have good judgment; conversely, when two people disagree, each is more likely to mistrust the other’s judgment. Using this conception of trust, O’Connor and Weatherall demonstrated that highly polarized cliques form even in networks of rational agents, agents who weigh evidence both on its empirical merits and on the trust they place in its source. As agents became more or less convinced of a premise, they began to trust new sources of information whose views were more similar to their own. In fact, agents would often discount information entirely if they deemed its source too radical. Since it makes sense to weigh the strength of information against the credibility of its provider, it is completely rational to distrust “radical” sources. And yet, even in this rationality, polarization occurs. Polarization, then, is often a result of the pre-established views of individual actors and of which sources of news they choose to trust.
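The mechanism O’Connor and Weatherall describe can be sketched in a few lines of code. The toy simulation below is a loose illustration of this family of network-epistemology models, not their actual model: the function names, parameters, and specific update rule are my own assumptions. Each agent holds a credence that a new theory works; agents who favor the theory test it, everyone updates on the shared evidence, and each update is discounted in proportion to the belief distance between receiver and source, so sufficiently “radical” sources are ignored outright.

```python
import random

def bayes_update(credence, successes, trials, p_new=0.6, p_old=0.5):
    """Bayesian update of the credence that the new theory works,
    given `successes` out of `trials` test results. The binomial
    coefficient cancels in the posterior, so it is omitted."""
    like_new = p_new**successes * (1 - p_new)**(trials - successes)
    like_old = p_old**successes * (1 - p_old)**(trials - successes)
    num = credence * like_new
    return num / (num + (1 - credence) * like_old)

def simulate(n_agents=20, rounds=200, trials=10, mistrust=2.0, seed=1):
    """Toy network of rational agents who discount evidence from
    sources whose beliefs differ from their own (sketch, not the
    authors' published model)."""
    random.seed(seed)
    credences = [random.random() for _ in range(n_agents)]
    for _ in range(rounds):
        # Agents who favor the new theory test it (true success rate 0.6).
        results = {}
        for i, c in enumerate(credences):
            if c > 0.5:
                results[i] = sum(random.random() < 0.6 for _ in range(trials))
        updated = []
        for i, c in enumerate(credences):
            for j, succ in results.items():
                posterior = bayes_update(c, succ, trials)
                # Discount evidence in proportion to belief distance;
                # past a point, the source is ignored entirely.
                d = min(1.0, mistrust * abs(c - credences[j]))
                c = (1 - d) * posterior + d * c
            updated.append(c)
        credences = updated
    return credences
```

With the mistrust multiplier set high enough, skeptical agents never accept evidence from convinced agents, so the population tends to freeze into one cluster of believers and one of deniers, even though every individual update is rational.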
Here, social media again tips the scale in favor of polarization through algorithms that personalize each user’s experience. In a review of the current academic literature on social media and polarization, researchers found that people seek out and engage with political information that reinforces pre-established views. Although studying the effects of algorithms is difficult, the review concluded that sites like Facebook “increasingly align content with cues about users’ political ideology.” Social media algorithms react to what users engage with, and what users engage with is highly polarizing political content.15 Users are therefore exposed to agreeable, and increasingly polarizing, political content. According to O’Connor and Weatherall’s network models, this makes individuals more likely to continuously update their views toward increasingly radical content. The eventual disconnect between opinions sows further mistrust between opposing political ideologies, reducing the likelihood that individuals will moderate their views with evidence from opposing sources. Furthermore, studies have found that “falsehood [diffuses] significantly farther, faster, deeper, and more broadly than the truth in all categories of information [on social media].” Since misinformation spreads so quickly on social media, platforms risk becoming propaganda machines—places where fake news is seen more often than real news and the line between reality and illusion thins. The information that causes radicalization, therefore, may often be false, especially since similarity of views breeds higher trust in it. The result is an intensely polarized news feed where moralization, anger, and misinformation are rewarded over calm, nuanced, empirical analysis. Since users of these sites are already primed to rely on prejudices, they may be all the more open to mistrusting the opposing side and, therefore, to radicalization.
Yet even without such priming, the natural tendency to trust those with similar views is all that is needed for the hyper-partisan content on social media to further radicalize individuals.
As social media sites bombard users with dense and varied streams of information, they deplete the mental energy used for logical analysis. This depletion significantly reduces comprehension and produces more bigoted, less empathetic judgments. Social media sites therefore prime users to believe in dehumanized caricatures of their political opponents, and as users come to perceive their political opposites as malicious adversaries, they begin to mistrust them. Yet mistrust can also arise among rational agents with no assumptions of malice, so long as those agents hold different views; and mistrust, in turn, makes it more likely that groups of people polarize. In this sense, social media websites create a reinforcing loop of radicalization: depleted mental energy first encourages mistrust and polarization, and the resulting difference of opinions deepens both. Algorithms contribute by segregating individuals into groups based on political opinion. The resulting echo chambers facilitate the rapid spread of self-validating information, while the algorithms themselves reward polarizing content and misinformation, breeding further mistrust and polarization. Social media sites, therefore, encourage widespread political polarization. The resulting political gridlock and metadehumanization erode faith in our democratic institutions; political polarization from social media actively undermines our democracy.
Sources
1. Saxena, Akrati, Pratishtha Saxena, and Harita Reddy. “Fake News Detection Techniques for Social Media.” In Principles of Social Networking, 325–354. 2022.
2. O’Connor, Cailin, and James Weatherall. The Misinformation Age: How False Beliefs Spread. Yale University Press, 2020.
3. Carr, Nicholas. The Shallows: How the Internet Is Changing the Way We Think, Read and Remember. London, England: Atlantic Books, 2020.
4. Delgado, Pablo, Cristina Vargas, Rakefet Ackerman, and Ladislao Salmerón. “Don’t Throw Away Your Printed Books: A Meta-Analysis on the Effects of Reading Media on Reading Comprehension.” Educational Research Review 25 (2018): 23–38. Accessed November 5, 2022.
5. Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
6. Landry, Alexander, Elliot Ihm, Spencer Kwit, and Jonathan Schooler. “Metadehumanization Erodes Democratic Norms during the 2020 Presidential Election.” Analyses of Social Issues and Public Policy 21, no. 1 (2021): 51–63. Accessed November 5, 2022.
7. “Social Media and News Fact Sheet.” Pew Research Center’s Journalism Project. September 20, 2022. https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/.
8. Tapscott, Don. Grown Up Digital: How the Net Generation Is Changing Your World. New York: McGraw-Hill, 2009.
9. Van Bavel, Jay, Steve Rathje, Elizabeth Harris, Claire Robertson, and Anni Sternisko. “How Social Media Shapes Polarization.” Trends in Cognitive Sciences 25, no. 11 (2021): 913–915. Accessed November 5, 2022.