The Internet's Filter Bubble Isn't As Strong As You Think

A quote often attributed to Thomas Jefferson holds that "the cornerstone of democracy rests on the foundation of an educated electorate." Part of being an educated citizen is exposing yourself to ideas you disagree with, since they give you the chance to change your mind, or at least to understand the other side. But in a world where everyone gets their news and information from an increasingly personalized internet, there's a fear that we're seeing fewer and fewer things we disagree with. Internet activist Eli Pariser has dubbed that digital echo chamber the "filter bubble." But how strong is the filter bubble, really? Are computer algorithms actually to blame for political polarization?

Researchers at Facebook were concerned about this possibility. Here's why: your Facebook news feed is curated by a ranking algorithm that decides what you see based on factors like popularity, relevance, and a nebulous "human element." While those qualities help keep you engaged on the site, they may also keep you from seeing content you don't agree with—particularly political news from the other side of the aisle. For a study published in the journal Science in 2015, Facebook researchers set out to examine the effects of this potential filter bubble. Over a six-month period, they analyzed the activity of 10 million anonymized Facebook users who had included their political affiliation in their profiles. They tracked how often a news story considered "cross-cutting" (that is, one more likely to be posted by someone with the opposite political viewpoint) appeared in a user's news feed, and how often it was filtered out by the algorithm. The results? Facebook's algorithm makes it only 1% less likely for a user to be exposed to a politically cross-cutting story. The team concluded, "The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals." In other words, it's not the algorithm—it's us.
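To see what a figure like that is measuring, here's a minimal, hypothetical sketch in Python (not the researchers' actual code, and the data is made up): compare how often cross-cutting stories appear among everything a user's friends share versus what a ranking algorithm actually displays.

```python
def cross_cutting_exposure(stories):
    """Fraction of stories that come from the other side of the aisle."""
    if not stories:
        return 0.0
    return sum(1 for s in stories if s["cross_cutting"]) / len(stories)

# Toy data: each story is tagged with whether it is cross-cutting
# and whether the ranking algorithm actually displayed it.
feed = [
    {"cross_cutting": True,  "shown": True},
    {"cross_cutting": True,  "shown": False},   # ranked too low to be displayed
    {"cross_cutting": False, "shown": True},
    {"cross_cutting": False, "shown": True},
    {"cross_cutting": True,  "shown": True},
]

potential = cross_cutting_exposure(feed)                          # everything friends shared
actual = cross_cutting_exposure([s for s in feed if s["shown"]])  # what the ranked feed showed
reduction = (potential - actual) / potential * 100

print(f"Exposure without ranking: {potential:.0%}")
print(f"Exposure with ranking:    {actual:.0%}")
print(f"Relative reduction:       {reduction:.0f}%")
```

The real study did essentially this comparison at scale, across millions of users and their self-reported political affiliations.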

Google's search algorithms, likewise, probably don't personalize your results as much as people think they do. A spokesman for Google told Slate's Jacob Weisberg, "We actually have algorithms in place designed specifically to limit personalization and promote variety in the results page." Similarly, Harvard law and computer science professor Jonathan Zittrain told Weisberg, "In my experience, the effects of search personalization have been light." Still, unlike with Facebook, you can easily turn off Google's personalization feature if you're wary.

It takes time and effort to avoid confirmation bias and actively seek out the stories you don't agree with, but a well-rounded political education is possible. Learn more about avoiding bias and expanding your perspectives in the videos below.

The Social Media Echo Chamber

Suraj Patel explains how most people use Twitter to reinforce their beliefs.

Curiosity Can Cure Bias

A 2016 Yale study found that the more scientific curiosity a person had, the more likely they were to change their views on a politically polarizing topic. By contrast, the more scientific knowledge they had, the less likely they were to change their views.

How Morals Influence Your Political Leanings

Experimental social psychologist Peter Ditto explains how moral foundations theory can predict your political ideology.

Key Facts In This Video

  • What the five pillars of morality are (0:33)

  • Liberals focus on harm and fairness when deciding if something is morally acceptable. (1:08)

  • Libertarians show lower empathy than both liberals and conservatives. (2:28)
