Cognitive Bias Susceptibility Test
Which Cognitive Biases Control Your Thinking?
Your brain is not a logic machine — it's a shortcut machine. Cognitive biases are the mental shortcuts that help you think fast but sometimes think wrong. Everyone has them, but not everyone has the same ones.
This test measures your susceptibility to five of the most well-documented cognitive biases. Your scores are normed against population data — higher means more susceptible, not smarter or dumber.
Question 1 of 25
I tend to search for news sources that align with my current political or social views.
Strongly Disagree
Strongly Agree
Cognitive biases are systematic patterns of deviation from rationality in judgment. They are not signs of stupidity — they are side effects of the mental shortcuts your brain uses to process an overwhelming world quickly. The five biases in this test come from several different research traditions spanning decades. Confirmation Bias was first demonstrated by Peter Wason [1] in 1960. Anchoring and the Availability Heuristic were identified by Tversky and Kahneman [2][3] in the 1970s as part of their heuristics-and-biases program. The Sunk Cost Fallacy was formalized by Thaler [4] and Arkes and Blumer [5]. The Dunning-Kruger effect was published in 1999 by Kruger and Dunning [6] at Cornell. No single researcher catalogued all five — they emerged independently across different fields.
Confirmation Bias is the tendency to seek, interpret, and remember information in ways that confirm your pre-existing beliefs. It affects everything from which news articles you click to how you evaluate a job candidate. Anchoring Bias occurs when the first piece of information you encounter — a price tag, a statistic, a suggestion — disproportionately influences all subsequent judgments, even if that anchor was arbitrary. Negotiators, marketers, and courtroom attorneys exploit this bias routinely.
The Sunk Cost Fallacy is the irrational tendency to continue investing in something because of resources already spent, rather than evaluating the decision purely on its future value. It keeps people in bad relationships, doomed projects, and losing investments long past the point of reason. The Availability Heuristic makes you estimate the probability of events based on how easily you can recall vivid examples, which is why plane crashes feel more dangerous than heart disease even though the statistics say otherwise. Dramatic, emotional, or recent events get overweighted because they are easy to picture.
The Dunning-Kruger Tendency describes the pattern where people with limited knowledge in a domain overestimate their own competence — precisely because they lack the expertise to recognize what they don't know. This pattern has been replicated across many domains, though it is worth noting that Gignac and Zajenkowski [7] have shown it can partly emerge as a statistical artifact of bounded scales and regression to the mean, rather than a purely psychological mechanism. The debate is ongoing. These five biases interact in subtle ways. Confirmation Bias can reinforce overconfidence by filtering out feedback that might reveal incompetence, while Anchoring Bias can amplify the Sunk Cost Fallacy by locking you onto your original investment figure.
The practical value of knowing your bias profile is self-awareness. People who know they are prone to Anchoring, for example, can train themselves to generate independent estimates before looking at reference numbers. Those who recognize their Sunk Cost vulnerability can build "kill criteria" into projects before emotional attachment takes hold. And those with high Availability Heuristic scores can learn to pause and check actual statistics before making risk judgments based on vivid memories alone. Awareness does not eliminate biases, but it gives you a chance to catch them before they drive decisions.
This test uses 25 Likert-scale items, five per construct, with reverse-scored items to control for response bias. Your raw responses are converted into factor scores using empirically derived loadings and then mapped to population-normed percentiles. A percentile of 80 on Confirmation Bias means you endorsed confirmation-seeking patterns more strongly than roughly 80 percent of the norming sample. Each bias is measured independently, so your profile may reveal surprising combinations — high susceptibility to one bias and strong resistance to another.
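The scoring pipeline described above can be sketched in a few lines. This is an illustrative outline only: the item weights (loadings) and norming statistics shown are hypothetical placeholders, not the test's actual empirically derived values, and the percentile mapping assumes normally distributed norm scores for simplicity.

```python
from statistics import NormalDist

LIKERT_MAX = 5  # assumed 1-5 agreement scale

def reverse_score(response: int) -> int:
    """Flip a reverse-keyed item: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return (LIKERT_MAX + 1) - response

def factor_score(responses, reverse_keyed, loadings):
    """Weighted sum of reverse-corrected item responses for one bias."""
    adjusted = [reverse_score(r) if rev else r
                for r, rev in zip(responses, reverse_keyed)]
    return sum(w * a for w, a in zip(loadings, adjusted))

def percentile(score, norm_mean, norm_sd):
    """Map a factor score to a 0-100 percentile against the norming sample
    (assumes approximately normal norm scores)."""
    return round(100 * NormalDist(norm_mean, norm_sd).cdf(score))

# Hypothetical example: five Confirmation Bias items, one reverse-keyed.
responses = [4, 5, 2, 4, 3]
reverse_keyed = [False, False, True, False, False]
loadings = [0.80, 0.70, 0.60, 0.75, 0.65]  # placeholder loadings

raw = factor_score(responses, reverse_keyed, loadings)
pct = percentile(raw, norm_mean=14.05, norm_sd=3.0)  # placeholder norms
```

A percentile of 80 here would mean the respondent's factor score exceeds roughly 80 percent of the (assumed) norming distribution, matching the interpretation given above.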
Footnotes
1. Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140. doi:10.1080/17470216008416717
2. Tversky, A. & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. doi:10.1016/0010-0285(73)90033-9
3. Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. doi:10.1126/science.185.4157.1124
4. Thaler, R. (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior & Organization, 1(1), 39–60. doi:10.1016/0167-2681(80)90051-7
5. Arkes, H. R. & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140. doi:10.1016/0749-5978(85)90049-4
6. Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. doi:10.1037/0022-3514.77.6.1121
7. Gignac, G. E. & Zajenkowski, M. (2020). The Dunning-Kruger effect is (mostly) a statistical artefact. Intelligence, 80, 101449. doi:10.1016/j.intell.2020.101449

Why Use This Test?
- Everyone thinks they're rational. This test shows you exactly where your thinking goes wrong. Measure your susceptibility to Confirmation Bias, Anchoring, Sunk Cost Fallacy, Availability Heuristic, and the Dunning-Kruger effect — with real percentile scores, not vague labels.