Commentary

Harvard economist Roland Fryer: more information makes people more polarized, not less

Apr 21, 2025

Key Points

  • Harvard economist Roland Fryer's research shows people unconsciously interpret ambiguous evidence as confirming their existing beliefs, causing polarization to intensify as information abundance increases.
  • In Fryer's experiments, subjects exposed to balanced research on climate change and the death penalty ended more polarized than they started, mentally counting neutral signals as supporting their prior views.
  • The finding suggests that information abundance, not algorithmic sorting or echo chambers, drives political division: two people seeing identical facts can reach opposite conclusions, each filtering them through the lens of prior belief.

Summary

Harvard economist Roland Fryer's research on polarization exposes a mechanism that explains why more information makes people more divided, not less. The problem isn't echo chambers or algorithmic sorting. It's how people interpret ambiguous evidence.

When people encounter ambiguous signals that could support either side of a factual question, they instinctively interpret them as confirming whatever they already believe. In an experiment with 600 subjects presented with neutral research summaries on climate change and the death penalty, over half finished with more extreme beliefs than they started with, despite seeing evidence that was explicitly balanced.

The mechanism works like this. Imagine the truth is either A or B, with no gray area. People start with a prior belief and then observe signals. Some point clearly to A, some to B, and some are ambiguous. A fully rational actor would count ambiguous signals as neutral. Instead, people unconsciously count ambiguous signals as supporting their prior. If you started believing A, you would see three clear A signals, three clear B signals, and three ambiguous signals, but mentally count the ambiguous ones as A. The result is a 6-to-3 pattern confirming your original view, when the actual evidence was split evenly.
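
To make the counting concrete, here is a minimal sketch of that distortion in Python (not Fryer's formal model; the signal labels and the `prior` parameter are illustrative assumptions):

```python
from collections import Counter

def perceived_tally(signals, prior):
    """Count signals the way a biased observer does: ambiguous
    signals are mentally filed as supporting the prior belief."""
    tally = Counter()
    for s in signals:
        tally[prior if s == "ambiguous" else s] += 1
    return tally

# Evidence that is split evenly: 3 for A, 3 for B, 3 ambiguous.
signals = ["A"] * 3 + ["B"] * 3 + ["ambiguous"] * 3

print(perceived_tally(signals, prior="A"))  # Counter({'A': 6, 'B': 3})
print(perceived_tally(signals, prior="B"))  # Counter({'B': 6, 'A': 3})
```

The same evenly split evidence yields a 6-to-3 margin for whichever side the observer started on.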

This pattern compounds. Each future ambiguous signal gets skewed the same way, so observation after observation entrenches the view rather than correcting it. Fryer's mathematical result shows that when enough real-world information is open to interpretation (a car drifting slightly into your lane could signal either a genuine threat or normal driving variation), two people with different starting beliefs can observe the identical sequence of events and end up absolutely certain of opposite conclusions.
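
A short simulation shows the compounding (again a sketch, not Fryer's formal model: the 0.75 signal strength, the starting beliefs, and the signal mix are assumed numbers chosen for illustration):

```python
def update(belief_a, signal, p=0.75):
    """Bayesian update of P(truth = A) after one clear signal.
    A signal has likelihood p under the side it points to."""
    like_a = p if signal == "A" else 1 - p
    return belief_a * like_a / (belief_a * like_a + (1 - belief_a) * (1 - like_a))

def biased_run(belief_a, signals):
    """Each ambiguous signal is misread as a clear signal for
    whichever side the agent currently favors, then updated on."""
    for s in signals:
        if s == "ambiguous":
            s = "A" if belief_a >= 0.5 else "B"
        belief_a = update(belief_a, s)
    return belief_a

# One identical, perfectly balanced evidence stream: for every clear
# A signal and clear B signal there are two ambiguous ones.
stream = ["A", "B", "ambiguous", "ambiguous"] * 50

print(biased_run(0.9, stream))  # approaches 1.0: certain the truth is A
print(biased_run(0.1, stream))  # approaches 0.0: certain the truth is B
```

Both agents process the same balanced stream, but every misread ambiguous signal widens the gap, so they converge on opposite certainties.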

In a world where information is plentiful, people become more divided, not less. This holds even if everyone sees the same information, though in practice they don't. They choose Fox News or MSNBC.

This potentially reframes the Facebook polarization debate. Zuckerberg argued that polarization in the US is not driven by Facebook's design because countries with high Facebook usage, like Malaysia and Indonesia, haven't seen the same political division. Fryer's work suggests the real driver may simply be information abundance. In a cave with no information, you stay unpolarized. Expose people to plentiful, ambiguous data, and prior beliefs act as a lens that distorts interpretation until opposing camps see opposite truths in the same facts.