Cognitive reflection test

In psychology, the cognitive reflection test (CRT) is a task designed to measure a person's tendency to override an incorrect "gut" response and engage in further reflection to find a correct answer. It was first described by Shane Frederick in 2005. The CRT has a moderate positive correlation with measures of intelligence, such as the IQ test, and it correlates highly with various measures of mental heuristics.[2][3][4][5] However, its validity as a measure of "cognitive reflection" or "intuitive thinking" has been questioned,[1] and some researchers argue that the CRT actually measures cognitive ability (colloquially known as intelligence).[6]

Later research has shown that the CRT is a multifaceted construct: many respondents produce the correct answer from the start, while others fail to solve the test even when they reflect on their intuitive first answer. It has also been argued that suppressing the first answer is not the only factor behind successful performance on the CRT; numeracy and reflectivity both account for performance.[7]

Basis of test

According to Frederick, there are two general types of cognitive activity, called "system 1" and "system 2" (terms first used by Keith Stanovich and Richard West[8]). System 1 executes quickly and without reflection, while system 2 requires conscious thought and effort. Each of the cognitive reflection test's three questions has an obvious but incorrect response given by system 1; the correct response requires the activation of system 2. For system 2 to be activated, a person must notice that their first answer is incorrect, which requires reflection on their own cognition.[2]

Correlating measures

The test has been found to correlate with many measures of economic thinking, such as numeracy,[7] temporal discounting, risk preference, and gambling preference.[2] It has also been correlated with measures of mental heuristics, such as the gambler's fallacy, understanding of regression to the mean, the sunk cost fallacy, and others.[3][4][5]

Keith Stanovich found that cognitive ability is not strongly correlated with CRT scores, because greater ability improves CRT performance only under certain conditions. First, the test-taker must recognize the need to override their system 1 response; then they must have the cognitive resources available to carry out the override. If the test-taker does not need to inhibit system 1 for the override, the system 2 response follows immediately; otherwise, they must be able to sustain inhibition of system 1 in order to engage the system 2 response.[9] In contrast, some researchers have assessed the test's validity using an advanced item response theory method and found that the CRT likely measures cognitive ability.[1] The authors of that study argue that the CRT's validity has been questioned because of the scarcity of validity studies and the lack of a psychometric approach in its development.

Test questions and answers

The original test penned by Frederick contained only the following three questions:[2]

  1. A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?
  2. If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?
  3. In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

The intuitive answers that "system 1" typically gives to these questions are 10 cents, 100 minutes, and 24 days; the correct solutions are 5 cents, 5 minutes, and 47 days.
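
Each correct answer follows from simple arithmetic. As an illustration added here for clarity (not part of Frederick's test), a minimal Python sketch verifying all three solutions:

```python
# Q1: Let the ball cost b. The bat costs b + 1.00, and together they
# cost 1.10, so b + (b + 1.00) = 1.10, giving 2b = 0.10 and b = 0.05.
ball = (1.10 - 1.00) / 2
assert abs(ball - 0.05) < 1e-9  # 5 cents, not the intuitive 10 cents

# Q2: 5 machines make 5 widgets in 5 minutes, so each machine makes one
# widget per 5 minutes. 100 machines working in parallel therefore make
# 100 widgets in the same 5 minutes.
minutes_per_widget = 5
machines = widgets = 100
time_needed = minutes_per_widget * widgets / machines
assert time_needed == 5  # 5 minutes, not the intuitive 100 minutes

# Q3: The patch doubles daily and covers the lake on day 48, so it must
# have covered half the lake one doubling earlier, on day 47.
assert 48 - 1 == 47  # 47 days, not the intuitive 24 days
```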

Neurodiversity and reflection

Research applying the CRT to neurodivergent populations, especially autism, has explored differences in intuitive versus deliberative processing. Initial studies[10][11][12] have been formative to the Dual Process Theory of Autism,[13] indicating that autistic adults, or those with high autistic traits, show fewer intuitive incorrect responses and a tendency towards greater reflective processing on the CRT relative to neurotypical peers. However, the evidence has been mixed: although autistic individuals self-report lower levels of intuition, their objective performance, including measures of deliberation, is comparable to that of neurotypical controls.[14] In a larger sample, neurodivergent students were found to be equivalent to neurotypical students on the CRT across cultures.[15] Once these groups were split into autism, ADHD, dyspraxia, dyscalculia, and dyslexia, however, dyscalculic students scored lower on CRT deliberative performance than neurotypical adults, while dyslexic adults scored higher. This indicates that academic challenges in this group may be more closely related to high levels of co-occurring anxiety than to an impairment in the capacity for cognitive reflection.[15]

Limitations and alternatives

Studies have estimated that between 44% and 51% of research participants have previously been exposed to the CRT.[16][17] Participants who are familiar with the CRT tend to outscore those with no previous exposure, which raises questions about the validity of the measure in this population.[16][17] In an effort to combat limitations associated with familiarity, researchers have developed a variety of alternative measures of cognitive reflection.[18][19][20] Recent research, however, suggests that the CRT is robust to multiple exposures: although raw scores increase in experienced participants, the test's correlations with other variables remain unaffected.[21]

Another limitation is the test's weak psychometric grounding and the scarcity of validity studies in the literature.[1] The CRT was not designed in a manner that aligns with industry standards such as the Standards for Educational and Psychological Testing, developed by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education.

References

  1. Blacksmith, Nikki; Yang, Yongwei; Behrend, Tara S.; Ruark, Gregory A. (2019). "Assessing the validity of inferences from scores on the cognitive reflection test". Journal of Behavioral Decision Making. 32 (5): 599–612. doi:10.1002/bdm.2133. ISSN 1099-0771. S2CID 197706996.
  2. Frederick, Shane (2005). "Cognitive Reflection and Decision Making". Journal of Economic Perspectives. 19 (4): 25–42. doi:10.1257/089533005775196732.
  3. Oechssler, Jörg; Roider, Andreas; Schmitz, Patrick W. (2009). "Cognitive abilities and behavioral biases" (PDF). Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018. ISSN 0167-2681.
  4. Hoppe, Eva I.; Kusterer, David J. (2011). "Behavioral biases and cognitive reflection". Economics Letters. 110 (2): 97–100. doi:10.1016/j.econlet.2010.11.015. ISSN 0165-1765.
  5. Toplak, Maggie E.; West, Richard F.; Stanovich, Keith E. (2011). "The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks" (PDF). Memory & Cognition. 39 (7): 1275–1289. doi:10.3758/s13421-011-0104-1. PMID 21541821. Retrieved 30 May 2014.
  6. Blacksmith, Nikki; Yang, Yongwei; Behrend, Tara S.; Ruark, Gregory A. (2019). "Assessing the validity of inferences from scores on the cognitive reflection test". Journal of Behavioral Decision Making. 32 (5): 599–612. doi:10.1002/bdm.2133. ISSN 1099-0771. S2CID 197706996.
  7. Szaszi, B.; Szollosi, A.; Palfi, B.; Aczél, B. (2017). "The cognitive reflection test revisited: exploring the ways individuals solve the test". Thinking & Reasoning. doi:10.1080/13546783.2017.1292954.
  8. Stanovich, Keith E.; West, Richard F. (2000). "Individual differences in reasoning: Implications for the rationality debate?". Behavioral and Brain Sciences. 23 (5): 645–665. doi:10.1017/s0140525x00003435.
  9. Stanovich, Keith E.; West, Richard F. (2008). "On the relative independence of thinking biases and cognitive ability". Journal of Personality and Social Psychology. 94 (4): 672–695. doi:10.1037/0022-3514.94.4.672.
  10. Brosnan, Mark; Ashwin, Chris (2023-07-01). "Thinking, fast and slow on the autism spectrum". Autism. 27 (5): 1245–1255. doi:10.1177/13623613221132437. ISSN 1362-3613. PMC 10291371. PMID 36325717.
  11. Brosnan, Mark; Lewton, Marcus; Ashwin, Chris (2016-06-01). "Reasoning on the Autism Spectrum: A Dual Process Theory Account". Journal of Autism and Developmental Disorders. 46 (6): 2115–2125. doi:10.1007/s10803-016-2742-4. ISSN 1573-3432. PMC 4860198. PMID 26960339.
  12. Brosnan, Mark; Ashwin, Chris; Lewton, Marcus (2017-08-01). "Brief Report: Intuitive and Reflective Reasoning in Autism Spectrum Disorder". Journal of Autism and Developmental Disorders. 47 (8): 2595–2601. doi:10.1007/s10803-017-3131-3. ISSN 1573-3432.
  13. Brosnan, Mark; Ashwin, Chris (2023-11-01). "Differences in Art Appreciation in Autism: A Measure of Reduced Intuitive Processing". Journal of Autism and Developmental Disorders. 53 (11): 4382–4389. doi:10.1007/s10803-022-05733-6. ISSN 1573-3432. PMC 10539443. PMID 36063312.
  14. Bastan, Elif; Beck, Sarah R.; Surtees, Andrew D. R. (2025-02-01). "Autistic people differ from non-autistic people subjectively, but not objectively in their reasoning". Autism. 29 (2): 355–366. doi:10.1177/13623613241277055. ISSN 1362-3613. PMC 11816476. PMID 39387554.
  15. Mahak, Sheeza; Malone, Stephanie; Elsherif, Mahmoud; Hand, Christopher J.; Morsanyi, Kinga (2025-10-27). "Academic anxiety and cognitive reflection in neurodivergence based on evidence from a large international sample". Scientific Reports. 15 (1): 37522. doi:10.1038/s41598-025-21504-6. ISSN 2045-2322. PMC 12559732.
  16. Haigh, Matthew (2016). "Has the Standard Cognitive Reflection Test Become a Victim of Its Own Success?". Advances in Cognitive Psychology. 12 (3): 145–149. doi:10.5709/acp-0193-5. PMC 5225989. PMID 28115997.
  17. Stieger, Stefan; Reips, Ulf-Dietrich (2016-09-06). "A limitation of the Cognitive Reflection Test: familiarity". PeerJ. 4: e2395. doi:10.7717/peerj.2395. ISSN 2167-8359. PMC 5018679. PMID 27651989.
  18. Primi, Caterina; Morsanyi, Kinga; Chiesi, Francesca; Donati, Maria Anna; Hamilton, Jayne (2016-12-01). "The Development and Testing of a New Version of the Cognitive Reflection Test Applying Item Response Theory (IRT)". Journal of Behavioral Decision Making. 29 (5): 453–469. doi:10.1002/bdm.1883. hdl:2158/1011727. ISSN 1099-0771. S2CID 56252490.
  19. Toplak, Maggie E.; West, Richard F.; Stanovich, Keith E. (2014-04-03). "Assessing miserly information processing: An expansion of the Cognitive Reflection Test". Thinking & Reasoning. 20 (2): 147–168. doi:10.1080/13546783.2013.844729. ISSN 1354-6783. S2CID 53340418.
  20. Thomson, Keela S.; Oppenheimer, Daniel M. (2016). "Investigating an alternate form of the cognitive reflection test". Judgment and Decision Making. 11: 99–113. doi:10.1017/S1930297500007622. S2CID 146924609.
  21. Bialek, Michal; Pennycook, Gordon (2017-08-28). "The Cognitive Reflection Test is robust to multiple exposures". Behavior Research Methods. 50 (5): 1953–1959. doi:10.3758/s13428-017-0963-x. PMID 28849403.