Researchers at Facebook were concerned about this possibility. Here's why: your Facebook news feed is curated by an algorithm that decides what you see based on each post's popularity, its relevance to you, and a more nebulous "human element." While those signals help keep you engaged on the site, they may also keep you from seeing content you don't agree with, particularly political news from the other side of the aisle. For a study published in the journal Science in 2015, Facebook researchers set out to measure this potential filter bubble. Over a six-month period, they analyzed the activity of 10 million anonymized Facebook users who had listed a political affiliation in their profiles. They watched for how often a "cross-cutting" news story (that is, one more likely to be posted by someone with the opposite political viewpoint) appeared on a user's wall, and how often the algorithm filtered it out. The results? Facebook's algorithm makes it only about 1% less likely for a user to be exposed to a politically cross-cutting story. The team concluded, "The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals." In other words, it's not the algorithm; it's us.
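To make the idea concrete, here is a purely illustrative toy model, not Facebook's actual algorithm (which is proprietary), showing how a feed that scores posts on popularity and relevance, and then applies a small penalty to cross-cutting stories, would change what reaches the top of a feed. The scoring weights, the 1% penalty, and the `Post` fields are all assumptions chosen for illustration:

```python
# Toy model of a news-feed filter. This is NOT Facebook's real algorithm;
# the signals, weights, and penalty value are illustrative assumptions.
from dataclasses import dataclass
import random

@dataclass
class Post:
    popularity: float    # e.g. a normalized like/share count
    relevance: float     # e.g. similarity to the user's interests
    cross_cutting: bool  # from the opposite political viewpoint?

def feed_score(post: Post, cross_cutting_penalty: float = 0.01) -> float:
    """Combine signals into one ranking score; downweight cross-cutting posts slightly."""
    score = 0.6 * post.popularity + 0.4 * post.relevance
    if post.cross_cutting:
        score *= 1.0 - cross_cutting_penalty  # the "filter bubble" effect
    return score

def exposure_rate(posts: list[Post], k: int, penalty: float) -> float:
    """Fraction of the top-k feed positions occupied by cross-cutting stories."""
    ranked = sorted(posts, key=lambda p: feed_score(p, penalty), reverse=True)
    return sum(p.cross_cutting for p in ranked[:k]) / k

random.seed(0)
posts = [Post(random.random(), random.random(), random.random() < 0.5)
         for _ in range(1000)]

no_filter = exposure_rate(posts, k=100, penalty=0.0)
filtered = exposure_rate(posts, k=100, penalty=0.01)
print(f"cross-cutting share without penalty: {no_filter:.2f}")
print(f"cross-cutting share with a 1% penalty: {filtered:.2f}")
```

Because the penalty only lowers the scores of cross-cutting posts, their share of the top of the feed can only stay the same or shrink, which is the small but measurable effect the study quantified.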
Google's search algorithms probably don't personalize your results as much as people think, either. A spokesman for Google told Slate's Jacob Weisberg, "We actually have algorithms in place designed specifically to limit personalization and promote variety in the results page." Harvard law and computer science professor Jonathan Zittrain agreed, telling Weisberg, "In my experience, the effects of search personalization have been light." Still, unlike with Facebook, you can easily turn off Google's search personalization if you're wary.
It takes time and effort to avoid confirmation bias and actively seek out the stories you don't agree with, but a well-rounded political education is possible. Learn more about avoiding bias and expanding your perspectives in the videos below.