§ HOW TO THINK BETTER · 13 MIN READ · Updated 2026-05-13

Confirmation Bias in Everyday Decisions

The most consequential cognitive bias in modern life — examined in detail, with the honest acknowledgment that it's also the hardest to escape.

"What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact."
Warren Buffett, Berkshire Hathaway Annual Meeting, 2002

Confirmation bias is the tendency to seek, interpret, and remember information that confirms our prior beliefs while overlooking or dismissing information that doesn't. It is, in the judgment of most cognitive psychologists, the most consequential bias in everyday decision-making — because it operates constantly, across all domains, and resists correction.

This article covers what confirmation bias is, how it shows up in different parts of life (investing, hiring, politics, personal relationships), why it exists, the empirical evidence, what (modestly) works to counter it, and the honest acknowledgment of how hard it is to escape.

What confirmation bias is

The basic mechanism: when we encounter information, we don't process it neutrally. We bring expectations, beliefs, and preferences. Information consistent with what we already believe is processed quickly, accepted easily, and retained well. Information inconsistent with what we believe is examined more critically, often dismissed, and forgotten faster.

This isn't a defect we can simply overcome by trying harder. It's a feature of how human cognition works.

The bias operates in three phases:

Phase 1 — Information search.

When we look for information, we look in places likely to confirm our existing beliefs. We read news sources that share our perspective. We search for terms framed in ways consistent with our views. We follow people on social media whose views resemble our own.

Phase 2 — Information interpretation.

When we encounter information, we interpret it through the lens of our existing beliefs. Ambiguous data is read as supporting our view. Strong data against our view is examined critically; weak data for our view is accepted easily.

Phase 3 — Information memory.

When we recall information later, we recall the confirming evidence more vividly and more completely than the disconfirming evidence. Over time, our memory of "what we've seen" skews systematically toward our prior beliefs.

The cumulative effect: even when the objective evidence is mixed or neutral, our subjective experience of that evidence skews toward our prior beliefs. This makes confirmation bias self-reinforcing: the biased experience of the evidence strengthens the very beliefs that produced the bias.

How it shows up: four domains

Investing

A retail investor buys a stock because they believe in the company. They read positive news about the company eagerly. They dismiss negative news as short-term noise. They remember the times the stock went up; they explain away the times it went down. Over time, they hold losing positions far longer than they should because their experience of the stock is filtered through their belief that they were right to buy it.

Professional investors are not immune. Hedge fund managers who develop a thesis tend to seek confirming data. Equity analysts whose recommendations are public have strong incentives to maintain those recommendations and resist contrary evidence.

The counter: Pre-mortem analysis. Before making an investment, ask: what would I see in 12 months that would tell me I was wrong? Set tripwires that force review when contrary evidence emerges. Subscribe to research from analysts who disagree with you.
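
To make the tripwire idea concrete, here is a minimal Python sketch; the language, metric names, and thresholds are all hypothetical and purely illustrative. The point is that the conditions are written down before the position is opened, so the later check is procedural rather than interpretive.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tripwire:
    """A condition committed to in advance that, if met, forces a review."""
    description: str
    tripped: Callable[[dict], bool]

# Hypothetical tripwires, specified before buying the stock.
TRIPWIRES = [
    Tripwire("Revenue growth falls below 5% year over year",
             lambda m: m["revenue_growth_yoy"] < 0.05),
    Tripwire("Position down more than 30% from cost basis",
             lambda m: m["return_vs_cost"] < -0.30),
    Tripwire("Two consecutive quarters of negative free cash flow",
             lambda m: m["negative_fcf_quarters"] >= 2),
]

def forced_reviews(metrics: dict) -> list[str]:
    """Return every tripped condition; any hit obliges a position review."""
    return [t.description for t in TRIPWIRES if t.tripped(metrics)]

# Quarterly check against current (hypothetical) numbers.
for hit in forced_reviews({"revenue_growth_yoy": 0.03,
                           "return_vs_cost": -0.12,
                           "negative_fcf_quarters": 1}):
    print("TRIPWIRE:", hit)  # fires on the revenue-growth condition
```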

Hiring

A hiring manager interviews a candidate. The first ten minutes establish an impression. The rest of the interview is filtered through that impression. If the manager's first impression was favorable, ambiguous answers are read positively; minor signals supporting the impression are weighted heavily; minor signals against it are explained away. If the first impression was unfavorable, the reverse.

By the end of the interview, the manager often feels they have made a careful evaluation. In reality, they have largely confirmed their first impression.

The counter: Structured interviews with pre-specified questions and scoring rubrics. Multiple independent evaluators who don't share notes until separately scoring. Diverse interview panels. Reference checks designed to elicit specific behavioral examples, not impressions.
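
As an illustration of the independent-scoring step, here is a minimal Python sketch; the rubric dimensions and the 1-to-5 scale are hypothetical. What matters is that scores are combined only after each evaluator has scored alone, so no one's first impression can anchor the others.

```python
from statistics import mean

# Hypothetical rubric, fixed before any interviews take place.
RUBRIC = ("problem_solving", "communication", "domain_knowledge")

def aggregate(scores_by_evaluator: dict[str, dict[str, int]]) -> dict[str, float]:
    """Average each rubric dimension across independently submitted scores."""
    return {dim: mean(scores[dim] for scores in scores_by_evaluator.values())
            for dim in RUBRIC}

# Three evaluators scored the candidate 1-5 on each dimension without conferring.
print(aggregate({
    "evaluator_a": {"problem_solving": 4, "communication": 3, "domain_knowledge": 5},
    "evaluator_b": {"problem_solving": 3, "communication": 4, "domain_knowledge": 4},
    "evaluator_c": {"problem_solving": 5, "communication": 4, "domain_knowledge": 3},
}))  # per-dimension averages, compared against a pre-set hiring bar
```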

Politics

A reader of a particular political orientation reads news from sources aligned with that orientation. They view politicians from their side charitably and politicians from the other side uncharitably. When their side does something they would condemn from the other side, they find ways to distinguish the cases. When the other side does something they would defend from their own side, they find ways to condemn it.

This is well-documented empirically. Political confirmation bias is one of the strongest in the literature.

The counter: Deliberate consumption of well-argued opposing-view publications. Following thoughtful commentators across the spectrum rather than only those who agree with you. A turnabout test: before defending your side's action, ask whether you would defend the other side doing the same.

Personal relationships

We see what we expect to see in people we know. If we expect a colleague to be unreliable, we notice their failures and miss their successes. If we expect a friend to be supportive, we interpret ambiguous behavior charitably. Once we form impressions of people, those impressions shape what we then see, which reinforces the impressions.

This is the foundation of much interpersonal conflict — people don't update their impressions of each other based on new behavior; they continue to see old behavior shaped by old impressions.

The counter: Periodic explicit re-evaluation. "If I met this person today with no prior impression, what would I think?" Asking others (who don't share your impression) for their honest read. Specific feedback exchanges that bypass general impressions.

Why it exists

Three accounts, in increasing depth.

Account 1 — Computational efficiency.

The mind cannot re-evaluate every belief continuously. Filtering new information through existing beliefs is computationally efficient. Most of the time, our beliefs are roughly correct, and the filtering produces correct interpretations.

The cost: when our beliefs are wrong, the filtering perpetuates the error.

Account 2 — Argumentative reasoning (Mercier and Sperber).

Reasoning evolved partly for argumentation — convincing others and detecting weak arguments in opponents. In this evolutionary context, confirmation bias is adaptive: it makes us effective advocates for positions we already hold.

The cost: it makes us bad at updating our own beliefs.

Account 3 — Identity and social belonging.

Our beliefs are tied to our identities and group memberships. Changing beliefs threatens these. Confirmation bias protects identity at the cost of accuracy.

In contexts where group membership is highly salient (politics, religion, intense ideological communities), confirmation bias is strongest. In contexts where accuracy is more directly rewarded (forecasting, scientific research with falsification mechanisms), confirmation bias is somewhat weaker.

What works (modestly)

The honest answer: not much. Confirmation bias is among the most stubborn cognitive biases. Methods that produce measurable but modest reductions:

Method 1 — Pre-commit to specific tests.

Before forming a strong belief, ask: what evidence would change my mind? Specify the test in advance. When the evidence comes in, you've committed to updating.

This works because it shifts the decision point from "is this evidence convincing?" (where confirmation bias operates) to "did the pre-specified condition occur?" (which is more procedural).

Method 2 — Engage with the strongest opposing view.

Read the smartest advocates of positions you disagree with. Not the easiest targets, but the strongest cases. If you can't refute the strongest version of a view, you haven't really refuted it.

Method 3 — Maintain a calibrated record.

Make predictions with explicit probabilities. Track your predictions over time. Calibrate to reality.

This is the superforecasting method. It works because reality provides feedback that bypasses your interpretation.
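
One common scoring rule for such a record is the Brier score: the mean squared difference between the probability you stated and the 0-or-1 outcome. Here is a minimal Python sketch; the specific claims and probabilities are made up for illustration.

```python
# A prediction log: claim, stated probability it happens, what actually happened.
PREDICTIONS = [
    ("Stock X beats Q3 earnings estimates", 0.80, True),
    ("Candidate Y accepts our offer", 0.60, False),
    ("Project ships by the deadline", 0.90, True),
    ("Competitor Z launches first", 0.30, False),
]

def brier_score(predictions) -> float:
    """Mean squared error between stated probabilities and 0/1 outcomes.

    0.0 is perfect; always answering 50% scores 0.25; confident wrong
    answers are punished hardest, which is what disciplines overclaiming.
    """
    return sum((p - float(outcome)) ** 2
               for _, p, outcome in predictions) / len(predictions)

print(f"Brier score: {brier_score(PREDICTIONS):.3f}")  # 0.125 for this log
```

Tracked over dozens of predictions, a drifting score is feedback that no favorable interpretation can argue away.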

Method 4 — Adversarial collaboration.

Work with someone who disagrees with you. Specifically: write a joint document where each of you can state your view and the other can respond. This forces engagement with the strongest version of the opposing view.

Method 5 — Defer to procedure.

In areas where biases are particularly dangerous (medical diagnosis, hiring, investment decisions), defer to checklists and procedures rather than to your judgment alone. Procedures are not perfect, but they're more bias-resistant than unstructured judgment.

The honest acknowledgment

Even with these methods, confirmation bias persists. The research is clear: simply knowing about confirmation bias doesn't reliably reduce it in your own thinking. People who have read about confirmation bias still apply the concept primarily to others' thinking, not their own. The bias operates below conscious awareness; conscious effort can be enlisted against it, but it cannot simply command the bias away.

The realistic goal: not eliminating confirmation bias, but creating institutional and personal practices that limit its damage. Structured procedures, pre-commitments, calibration practices, diverse inputs — these reduce but don't eliminate the bias.

This is humbling. It also points to why epistemically important institutions (peer review, adversarial legal proceedings, double-blind studies, free press, opposition parties) matter — they create structures that limit the damage of confirmation bias when individual cognition cannot.


Frequently asked

Is confirmation bias the same as motivated reasoning?
They overlap but are not identical. Confirmation bias is the unconscious filtering of information toward existing beliefs. Motivated reasoning is the active recruitment of arguments and evidence in support of a desired conclusion. Motivated reasoning is more deliberate; confirmation bias is more automatic. In practice, the two work together.

Are some people less susceptible to confirmation bias?
Modestly, yes. People with higher cognitive ability, those trained in scientific methodology, and those in fields with strong feedback loops (calibrated forecasters, professional analysts) show somewhat less confirmation bias on average. But "less" is not "none" — everyone is susceptible.

Why does confirmation bias persist if we know about it?
Because knowing about a bias doesn't reliably correct it. The bias operates below conscious awareness. Consciously trying not to be biased doesn't bypass the automatic filtering. The cognitive systems that produce the bias are not under conscious control.

Does confirmation bias affect scientific research?
Yes. Researchers tend to interpret data in ways favorable to their hypotheses, choose which results to publish (publication bias), and design follow-up studies in ways that confirm earlier findings. Science manages confirmation bias through institutional structures (peer review, replication, pre-registration of studies), not through researchers being unbiased.

Is openness to different views a personality trait or a skill?
Both. Some people are temperamentally more open to opposing views. But openness can also be developed through practice. The skills of steel-manning, calibrated forecasting, and adversarial collaboration can be learned regardless of temperament.

Does confirmation bias get worse in echo chambers?
Yes, significantly. When most of the information you encounter confirms your views, confirmation bias is reinforced more strongly. This is one reason intellectual diversity in information sources matters — not just for diversity's sake, but for accuracy.

Cited works & further reading

  • Nickerson, R. S. (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology, 2(2), 175–220. The standard scholarly review.
  • Mercier, H., & Sperber, D. (2017). The Enigma of Reason. Harvard University Press.
  • Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
  • Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown.

About the author

Tim Sheludyakov writes the Stoa library.

By Tim Sheludyakov · Edited 2026-05-13
