

Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing (Christopher Bail)

  • A study of Twitter reached similar conclusions. More than three-quarters of the people who retweet—or share—a message, the study concluded, belong to the same party as the message’s author.
  • But I believe the common wisdom about social media, echo chambers, and political polarization may not only be wrong, but also counterproductive.
  • I became so concerned about political tribalism that I founded the Polarization Lab—
  • will argue that our focus upon Silicon Valley obscures a much more unsettling truth: the root source of political tribalism on social media lies deep inside ourselves.
  • social media is more like a prism that refracts our identities—leaving us with a distorted understanding of each other, and ourselves.
  • “chicken or the egg” problem: do our social media networks shape our political beliefs, or do our political beliefs determine who we connect with in the first place?
  • But it occurred to us that bots could also be repurposed for valuable research. Instead of spreading misinformation, bots could expose people to different points of view.
  • We decided to build two bots: one that retweeted messages from prominent Republicans and another that shared messages from prominent Democrats. We could pay a large group of Democrats and Republicans to follow the bot from the opposing political party, we reasoned, and survey them before and after
  • elected to run our experiment on Twitter,
  • 1,220 Americans who used Twitter at least three times per week and identified with either the Democratic or Republican Party to answer a range of questions about their political beliefs and behaviors. We asked the respondents a series of ten questions about social policy issues such as racial inequality, the environment, and government regulation of the economy.
  • In addition to measuring their beliefs with a traditional public opinion survey, we also asked respondents to share their Twitter handles with us. This allowed us to track their behavior before and after the experiment and also account for the strength of their echo chambers before they followed our bots.
  • Each hour, our bots randomly selected one of these accounts that had tweeted within the previous hour and retweeted its message.
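The hourly selection step described in the highlight above can be sketched in a few lines. This is an illustrative reconstruction, not the lab's actual code: the function name, the account list, and the timestamps are all made up for the example.

```python
import random
from datetime import datetime, timedelta

def pick_account_to_retweet(accounts, last_tweet_times, now, rng=random):
    """Return one account, chosen at random, that tweeted within the past hour.

    `last_tweet_times` maps each account to the time of its most recent tweet.
    Returns None if no account is eligible this hour.
    """
    recent = [a for a in accounts
              if now - last_tweet_times[a] <= timedelta(hours=1)]
    if not recent:
        return None
    return rng.choice(recent)

# Toy usage with invented accounts and timestamps:
now = datetime(2017, 10, 15, 12, 0)
accounts = ["@pol_a", "@pol_b", "@pol_c"]
last_seen = {
    "@pol_a": now - timedelta(minutes=30),  # tweeted recently -> eligible
    "@pol_b": now - timedelta(hours=3),     # too long ago -> excluded
    "@pol_c": now - timedelta(minutes=59),  # just inside the window -> eligible
}
choice = pick_account_to_retweet(accounts, last_seen, now)
```

Run once per hour, this yields the 24 retweets per day that participants were told to expect.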
  • we told them they would receive $11 for following a bot that would retweet twenty-four messages each day for one month, without telling them anything about the messages themselves.
  • To measure how much study participants complied with the treatment we were trying to give them, we offered them up to eighteen additional dollars if they could both identify an animal that our bots retweeted during the study period (but was later deleted) and answer questions about the content of our bot’s messages.
  • In mid-November 2017, we sent everyone in the study a survey that asked the same questions about social policies we had asked them one month earlier.
  • By comparing how much people who followed our bots moved on our liberal-conservative scale to the movement of those in our control condition, we were finally able to learn what happens when people step outside their echo chambers.
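The comparison described above amounts to a difference between pre/post shifts in the treated group and the control group. A minimal sketch, with made-up numbers standing in for the ten-item scale scores (nothing here reproduces the study's actual data):

```python
def mean_shift(pre, post):
    """Average change in scale scores from the pre-survey to the post-survey."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

def treatment_effect(pre_t, post_t, pre_c, post_c):
    """Treated group's average shift minus the control group's average shift."""
    return mean_shift(pre_t, post_t) - mean_shift(pre_c, post_c)

# Toy data: higher numbers = more conservative on the liberal-conservative scale.
pre_treated  = [0.2, 0.5, 0.1, 0.4]
post_treated = [0.5, 0.8, 0.4, 0.7]      # average shift of +0.3
pre_control  = [0.3, 0.2, 0.6, 0.1]
post_control = [0.35, 0.25, 0.65, 0.15]  # average shift of +0.05

effect = treatment_effect(pre_treated, post_treated, pre_control, post_control)
```

A positive `effect` for Republicans following the Democratic bot is the pattern the study reports: movement away from, not toward, the other side.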
  • the figure shows, neither Democrats nor Republicans became more moderate when they followed our bots.
  • On average, Republicans who followed our Democratic bot for one month expressed markedly more conservative views than they had at the beginning of the study.17 And the more attention they paid to our bots, the more conservative they became. The results were less dramatic for Democrats.
  • Exposing people to views of the other side did not make participants more moderate. If anything, it reinforced their preexisting views.
  • Why didn’t taking people outside their echo chamber make them more moderate?
  • The worst of these attacks were previously obscured by her echo chamber, but now Patty was experiencing the full scale of partisan warfare for the first time. Unlike the unenthusiastic partisans we interviewed in our control condition, Patty came to realize that there was a war going on, and she had to choose a side.
  • Because of the mounting evidence that our political identities shape the way we understand the world around us, social scientists have mostly abandoned the idea that people dispassionately deliberate about the merits of each other’s arguments.
  • A seminal example of how social contexts shape the way humans develop our identities is the sociologist Charles Horton Cooley’s notion of the looking-glass self.
  • The deeper source of our addiction to social media, I’ve concluded, is that it makes it so much easier for us to do what is all too human: perform different identities, observe how other people react, and update our presentation of self to make us feel like we belong.
  • Partisan warfare, it seemed, is often more about status signaling and bonding than persuading others.
  • Proving one’s membership in a cult often becomes a sort of ritual in which members reward each other for taking increasingly extreme positions to prove their loyalty to the cause.
  • Many people with strong partisan views do not participate in such destructive behavior. But the people who do often act this way because they feel marginalized, lonely, or disempowered in their off-line lives.
  • it is fair to say that most Americans are not as extreme as one might think after spending an hour or two on social media.
  • I calculated that people who identified themselves as “moderate,” “slightly liberal,” or “slightly conservative” were about 40 percent more likely to report being harassed online than those who identified themselves as “extreme liberals” or “extreme conservatives.”
  • false polarization can be driven by online experiences in which a minority of extremists come to represent a more moderate majority.
  • The link between social media usage and false polarization is also driven by the fact that extremists post far more often than moderates.
  • “prolific political tweeters make up just 6% of all Twitter users but generate 20% of all tweets and 73% of tweets mentioning national politics.”
  • The researchers found that only a small fraction of Facebook users shared fake news, and most of that news was shared by elderly conservatives.37
  • Perhaps even more surprisingly, there is also very little evidence that microtargeting ads influences consumer behavior.
  • The researchers found that of the remaining respondents, most consume information from a mix of liberal and conservative sources. The only people stuck deep inside echo chambers, they concluded, were the minority of people on the extremes of the liberal-conservative continuum.
  • DiMaggio and his colleagues discovered that the rates of disagreement about this and many other divisive issues did not increase between the 1970s and 1990s.
  • Though false polarization was negligible in 1970, the political scientists Adam Enders and Miles Armaly found that it grew by nearly 20 percent over the next forty years.5
  • One of the most important messages I’d like readers to take away from this book is that social media has sent false polarization into hyperdrive.
  • In other words, the most pernicious effects of the social media prism operate at a subconscious level.
  • the lack of moderate voices on social media may contribute more to political polarization than the abundance of extremists on our platforms, because the absence of the former enables the latter to hijack the public conversation.
  • Moderate people need to decide which issues are so important to them that they won’t allow extremists to speak on their behalf.
  • ask yourself what really motivates you: Is this an issue that you are willing to die on a hill for?
  • I showed that stepping outside your echo chamber and immediately exposing yourself to a broad range of people who do not share your views might just make you more convinced that your side is just, honest, or right.
  • Carolyn Sherif argued that our response will be conditioned by the distance between this argument and our preexisting views.
  • If, on the other hand, the argument is within what Sherif called our “latitude of acceptance” (a range of attitudes about a given issue that an individual finds acceptable or reasonable even if they don’t agree with them a priori), then people will be more motivated to engage with the viewpoint and perhaps even move closer to it.22
  • Research by the sociologist Robb Willer indicates that the best way to bridge partisan divides is to communicate using arguments that resonate with the worldviews of the people you are trying to persuade.
  • A recent study by a team of psychologists found that moderates are more accurate than extremists at gauging the ideological extremity of the other side—in other words, moderates are less susceptible to the downward spiral of false polarization.28
  • I think that there is room for a new platform for political discussion.
  • We decided to give our platform a generic name: DiscussIt.
  • Once users were matched, the app assigned them an androgynous name such as Jamie or Casey and directed them to the main chat interface. Users could then discuss topics in real time or asynchronously.
  • Sherif, “Social Categorization as a Function of Latitude of Acceptance and Series Range.” See also Berger, The Catalyst, which describes two related concepts: the zone of acceptance and the zone of rejection.
  • 22. See Sherif, “Social Categorization.” For a more recent argument that parallels the concept of the latitude of acceptance, see Levendusky, “When Efforts to Depolarize the Electorate Fail.”
  • Levendusky, Matthew S., and Neil Malhotra. “(Mis)perceptions of Partisan Polarization in the American Public.” Public Opinion Quarterly 80, no. S1 (January 1, 2016): 378–91. https://doi.org/10.1093/poq/nfv045.

Saved by mark grabe on Dec 19, 21