Abstract
Individuals rely on others’ expertise to achieve a basic understanding of the world. But how can non-experts achieve understanding from explanations that, by definition, they are ill-equipped to assess? Across 9 experiments with 6,698 participants (Study 1A = 737; 1B = 734; 1C = 733; 2A = 1,014; 2B = 509; 2C = 1,012; 3A = 1,026; 3B = 512; 4 = 421), we address this puzzle by focusing on scientific explanations with jargon. We identify when and why the inclusion of jargon makes explanations more satisfying, despite decreasing their comprehensibility. We find that jargon increases satisfaction because laypeople assume the jargon fills gaps in explanations that are otherwise incomplete. We also identify strategies for debiasing these judgements: when people attempt to generate their own explanations, inflated judgements of poor explanations with jargon are reduced, and people become better calibrated in their assessments of their own ability to explain.
Data availability
Data for all studies are publicly available in a dedicated OSF folder at https://osf.io/ytakw/?view_only=f3c34c42f79d4ecca2ab5502c35c0591 (ref. 86).
Code availability
Code for all analyses, figures and tables in both the main text and Supplementary Information is publicly available in a dedicated OSF folder at https://osf.io/ytakw/?view_only=f3c34c42f79d4ecca2ab5502c35c0591 (ref. 86).
References
Keil, F. C. Running on empty? How folk science gets by with less. Curr. Dir. Psychol. Sci. 21, 329–334 (2012).
Sloman, S. & Fernbach, P. The Knowledge Illusion: Why We Never Think Alone (Penguin, 2018).
Ballantyne, N. Knowing Our Limits (Oxford Univ. Press, 2019).
DiPaolo, J. What’s wrong with epistemic trespassing? Philos. Stud. 179, 223–243 (2022).
DiPaolo, J. Who knows what? Epistemic dependence, inquiry, and function-first epistemology. Inquiry 67, 670–687 (2024).
Peltokorpi, V. Transactive memory systems. Rev. Gen. Psychol. 12, 378–394 (2008).
Ren, Y. & Argote, L. Transactive memory systems 1985–2010: an integrative framework of key dimensions, antecedents, and consequences. Acad. Manage. Ann. 5, 189–229 (2011).
Wegner, D. M. in Theories of Group Behavior (eds Mullen, B. & Goethals, G. R.) 185–208 (Springer, 1987).
Bromme, R. & Thomm, E. Knowing who knows: laypersons’ capabilities to judge experts’ pertinence for science topics. Cogn. Sci. 40, 241–252 (2016).
Wilkenfeld, D. A., Plunkett, D. & Lombrozo, T. Depth and deference: when and why we attribute understanding. Philos. Stud. 173, 373–393 (2016).
Sparrow, B., Liu, J. & Wegner, D. M. Google effects on memory: cognitive consequences of having information at our fingertips. Science 333, 776–778 (2011).
Bromme, R., Rambow, R. & Nückles, M. Expertise and estimating what other people know: the influence of professional experience and type of knowledge. J. Exp. Psychol. Appl. 7, 317–330 (2001).
Watson, J. C. Epistemic neighbors: trespassing and the range of expert authority. Synthese 200, 408 (2022).
Lilienfeld, S. O. Can psychology become a science? Pers. Individ. Dif. 49, 281–288 (2010).
Boyd, K. Trusting scientific experts in an online world. Synthese 200, 14 (2022).
Lilienfeld, S. O. Public skepticism of psychology: why many people perceive the study of human behavior as unscientific. Am. Psychol. 67, 111–129 (2012).
Aslanov, I. & Guerra, E. Tautological formal explanations: does prior knowledge affect their satisfiability? Front. Psychol. 14, 1258985 (2023).
Aslanov, I. A., Sudorgina, Y. V. & Kotov, A. A. The explanatory effect of a label: its influence on a category persists even if we forget the label. Front. Psychol. 12, 745586 (2022).
Bennett, E. M. & McLaughlin, P. J. Neuroscience explanations really do satisfy: a systematic review and meta-analysis of the seductive allure of neuroscience. Public Underst. Sci. 33, 290–307 (2024).
Bulut, N. S., Gürsoy, S. C., Yorguner, N., Çarkaxhiu Bulut, G. & Sayar, K. The seductive allure effect extends from neuroscientific to psychoanalytic explanations among Turkish medical students: preliminary implications of biased scientific reasoning within the context of medical and psychiatric training. Think. Reason. 28, 625–644 (2022).
Eriksson, K. The nonsense math effect. Judgm. Decis. Mak. 7, 746–749 (2012).
Fernandez-Duque, D., Evans, J., Christian, C. & Hodges, S. D. Superfluous neuroscience information makes explanations of psychological phenomena more appealing. J. Cogn. Neurosci. 27, 926–944 (2015).
Giffin, C., Wilkenfeld, D. & Lombrozo, T. The explanatory effect of a label: explanations with named categories are more satisfying. Cognition 168, 357–369 (2017).
Hemmatian, B. & Sloman, S. A. Community appeal: explanation without information. J. Exp. Psychol. Gen. 147, 1677–1712 (2018).
Hopkins, E. J., Weisberg, D. S. & Taylor, J. C. V. The seductive allure is a reductive allure: people prefer scientific explanations that contain logically irrelevant reductive information. Cognition 155, 67–76 (2016).
Hopkins, E. J., Weisberg, D. S. & Taylor, J. C. V. Does expertise moderate the seductive allure of reductive explanations? Acta Psychol. 198, 102890 (2019).
Liquin, E. G. & Lombrozo, T. Motivated to learn: an account of explanatory satisfaction. Cogn. Psychol. 132, 101453 (2022).
Minahan, J. & Siedlecki, K. L. Individual differences in need for cognition influence the evaluation of circular scientific explanations. Pers. Individ. Dif. 99, 113–117 (2016).
Rhodes, R. E., Rodriguez, F. & Shah, P. Explaining the alluring influence of neuroscience information on scientific reasoning. J. Exp. Psychol. Learn. Mem. Cogn. 40, 1432–1440 (2014).
Weisberg, D. S., Hopkins, E. J. & Taylor, J. C. V. People’s explanatory preferences for scientific phenomena. Cogn. Res. 3, 44 (2018).
Weisberg, D. S., Taylor, J. C. V. & Hopkins, E. J. Deconstructing the seductive allure of neuroscience explanations. Judgm. Decis. Mak. 10, 429–441 (2015).
Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E. & Gray, J. R. The seductive allure of neuroscience explanations. J. Cogn. Neurosci. 20, 470–477 (2008).
Bullock, O. M., Colón Amill, D., Shulman, H. C. & Dixon, G. N. Jargon as a barrier to effective science communication: evidence from metacognition. Public Underst. Sci. 28, 845–853 (2019).
Scharrer, L., Bromme, R. & Stadtler, M. Information easiness affects non-experts’ evaluation of scientific claims about which they hold prior beliefs. Front. Psychol. 12, 678313 (2021).
Scharrer, L., Pape, V. & Stadtler, M. Watch out: fake! How warning labels affect laypeople’s evaluation of simplified scientific misinformation. Discourse Process. 59, 575–590 (2022).
Scharrer, L., Stadtler, M. & Bromme, R. You’d better ask an expert: mitigating the comprehensibility effect on laypeople’s decisions about science-based knowledge claims. Appl. Cogn. Psychol. 28, 465–471 (2014).
Scharrer, L., Stadtler, M. & Bromme, R. Judging scientific information: does source evaluation prevent the seductive effect of text easiness? Learn. Instr. 63, 101215 (2019).
Scharrer, L., Britt, M. A., Stadtler, M. & Bromme, R. Easy to understand but difficult to decide: information comprehensibility and controversiality affect laypeople’s science-based decisions. Discourse Process. 50, 361–387 (2013).
Scharrer, L., Bromme, R., Britt, M. A. & Stadtler, M. The seduction of easiness: how science depictions influence laypeople’s reliance on their own evaluation of scientific information. Learn. Instr. 22, 231–243 (2012).
Rozenblit, L. & Keil, F. The misunderstood limits of folk science: an illusion of explanatory depth. Cogn. Sci. 26, 521–562 (2002).
Bromme, R., Thomm, E. & Ratermann, K. Who knows? Explaining impacts on the assessment of our own knowledge and of the knowledge of experts. Z. Pädagog. Psychol. 30, 97–108 (2016).
Meyers, E. A., Gretton, J. D., Budge, J. R. C., Fugelsang, J. A. & Koehler, D. J. Broad effects of shallow understanding: explaining an unrelated phenomenon exposes the illusion of explanatory depth. Judgm. Decis. Mak. 18, e24 (2023).
Kuznetsova, A., Brockhoff, P. B. & Christensen, R. H. B. lmerTest package: tests in linear mixed effects models. J. Stat. Softw. 82, 1–26 (2017).
Montoya, A. K. Probing moderation analysis in two-instance repeated-measures designs. Multivar. Behav. Res. 53, 140–141 (2018).
Diedenhofen, B. & Musch, J. cocor: a comprehensive solution for the statistical comparison of correlations. PLoS ONE 10, e0121945 (2015).
Zou, G. Y. Toward using confidence intervals to compare correlations. Psychol. Methods 12, 399–413 (2007).
Bromme, R. & Goldman, S. R. The public’s bounded understanding of science. Educ. Psychol. 49, 59–69 (2014).
Kelemen, D., Rottman, J. & Seston, R. Professional physical scientists display tenacious teleological tendencies: purpose-based reasoning as a cognitive default. J. Exp. Psychol. Gen. 142, 1074–1083 (2013).
Goldberg, R. F. & Thompson-Schill, S. L. Developmental ‘roots’ in mature biological knowledge. Psychol. Sci. 20, 480–487 (2009).
Shtulman, A. Qualitative differences between naïve and scientific theories of evolution. Cogn. Psychol. 52, 170–194 (2006).
Shtulman, A. & Valcarcel, J. Scientific knowledge suppresses but does not supplant earlier intuitions. Cognition 124, 209–215 (2012).
Kovaka, K. Climate change denial and beliefs about science. Synthese 198, 2355–2374 (2021).
Anderson, C., Brion, S., Moore, D. A. & Kennedy, J. A. A status-enhancement account of overconfidence. J. Pers. Soc. Psychol. 103, 718–735 (2012).
Cheever, N. A. & Rokkum, J. in The Wiley Handbook of Psychology, Technology, and Society (eds Rosen, L. D. et al.) 56–73 (Wiley, 2015).
Hofer, B. K. Epistemological understanding as a metacognitive process: thinking aloud during online searching. Educ. Psychol. 39, 43–55 (2004).
Fonseca, B. & Chi, M. in Handbook of Research on Learning and Instruction (eds Mayer, R. E. & Alexander, P. A.) 296–312 (Routledge, 2010).
Kruger, J. & Dunning, D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J. Pers. Soc. Psychol. 77, 1121–1134 (1999).
Sanchez, C. & Dunning, D. Intermediate science knowledge predicts overconfidence. Trends Cogn. Sci. 28, 284–285 (2024).
Lackner, S., Francisco, F., Mendonça, C., Mata, A. & Gonçalves-Sá, J. Intermediate levels of scientific knowledge are associated with overconfidence and negative attitudes towards science. Nat. Hum. Behav. 7, 1490–1501 (2023).
Light, N., Fernbach, P. M., Rabb, N., Geana, M. V. & Sloman, S. A. Knowledge overconfidence is associated with anti-consensus views on controversial scientific issues. Sci. Adv. 8, eabo0038 (2022).
Sanchez, C. & Dunning, D. Overconfidence among beginners: is a little learning a dangerous thing? J. Pers. Soc. Psychol. 114, 10–28 (2018).
Sanchez, C. & Dunning, D. Decision fluency and overconfidence among beginners. Decision 7, 225–237 (2020).
Hoogeveen, S. et al. The Einstein effect provides global evidence for scientific source credibility effects and the influence of religiosity. Nat. Hum. Behav. 6, 523–535 (2022).
Kominsky, J. F., Zamm, A. P. & Keil, F. C. Knowing when help is needed: a developing sense of causal complexity. Cogn. Sci. 42, 491–523 (2018).
Simis, M. J., Madden, H., Cacciatore, M. A. & Yeo, S. K. The lure of rationality: why does the deficit model persist in science communication? Public Underst. Sci. 25, 400–414 (2016).
Trench, B. in Communicating Science in Social Contexts: New Models, New Practices (eds Cheng, D. et al.) 119–135 (Springer, 2008).
Hendriks, F., Kienhues, D. & Bromme, R. in Trust and Communication in a Digitized World: Models and Concepts of Trust Research (ed. Blöbaum, B.) 143–159 (Springer, 2016).
Kaden, T., Jones, S., Catto, R. & Elsdon-Baker, F. Knowledge as explanandum: disentangling lay and professional perspectives on science and religion. Stud. Relig. 47, 500–521 (2018).
Scheufele, D. A. Science communication as political communication. Proc. Natl Acad. Sci. USA 111, 13585–13592 (2014).
Cruz, F. & Mata, A. Self-serving beliefs about science: science justifies my weaknesses (but not other people’s). Public Underst. Sci. 34, 172–187 (2025).
Ditto, P. H. & Lopez, D. F. Motivated skepticism: use of differential decision criteria for preferred and nonpreferred conclusions. J. Pers. Soc. Psychol. 63, 568–584 (1992).
Munro, G. D. & Ditto, P. H. Biased assimilation, attitude polarization, and affect in reactions to stereotype-relevant scientific information. Pers. Soc. Psychol. Bull. 23, 636–653 (1997).
Rutjens, B. T., Sutton, R. M. & van der Lee, R. Not all skepticism is equal: exploring the ideological antecedents of science acceptance and rejection. Pers. Soc. Psychol. Bull. 44, 384–405 (2018).
Scharrer, L., Rupieper, Y., Stadtler, M. & Bromme, R. When science becomes too easy: science popularization inclines laypeople to underrate their dependence on experts. Public Underst. Sci. 26, 1003–1018 (2017).
Sloman, S. A. & Rabb, N. Your understanding is my understanding: evidence for a community of knowledge. Psychol. Sci. 27, 1451–1460 (2016).
Fisher, M., Goddu, M. K. & Keil, F. C. Searching for explanations: how the Internet inflates estimates of internal knowledge. J. Exp. Psychol. Gen. 144, 674–687 (2015).
Rabb, N., Fernbach, P. M. & Sloman, S. A. Individual representation in a community of knowledge. Trends Cogn. Sci. 23, 891–902 (2019).
Messeri, L. & Crockett, M. J. Artificial intelligence and illusions of understanding in scientific research. Nature 627, 49–58 (2024).
Birhane, A., Kasirzadeh, A., Leslie, D. & Wachter, S. Science in the age of large language models. Nat. Rev. Phys. 5, 277–280 (2023).
Ostinelli, M., Bonezzi, A. & Lisjak, M. Unintended effects of algorithmic transparency: the mere prospect of an explanation can foster the illusion of understanding how an algorithm works. J. Consum. Psychol. 35, 203–219 (2025).
Dung, L. Current cases of AI misalignment and their implications for future risks. Synthese 202, 138 (2023).
Kidd, C. & Birhane, A. How AI can distort human beliefs. Science 380, 1222–1223 (2023).
Celiktutan, B., Cadario, R. & Morewedge, C. K. People see more of their biases in algorithms. Proc. Natl Acad. Sci. USA 121, e2317602121 (2024).
Erdfelder, E., Faul, F. & Buchner, A. GPower: a general power analysis program. Behav. Res. Methods Instrum. Comput. 28, 1–11 (1996).
Lakens, D. & Caldwell, A. R. Simulation-based power analysis for factorial analysis of variance designs. Adv. Methods Pract. Psychol. Sci. https://doi.org/10.1177/2515245920951503 (2021).
Cruz, F., & Lombrozo, T. How laypeople evaluate scientific explanations containing jargon. OSF https://osf.io/ytakw/?view_only=f3c34c42f79d4ecca2ab5502c35c0591 (2023).
Acknowledgements
We thank Fundação para a Ciência e Tecnologia (Doctoral Fellowship 2022.13009.BD) and Fulbright Portugal (Fulbright Research Fellowship 2023–2024) for their support in sponsoring F.C.’s visit to Princeton University, as well as the Concepts and Cognition Lab for feedback on subsets of this work. This work was not supported by any external funding, and the entities mentioned above had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. A subset of these experiments was presented at the 51st Annual Meeting of the Society for Philosophy and Psychology and at the 46th Annual Meeting of the Cognitive Science Society.
Author information
Contributions
F.C. and T.L. contributed equally to this work. F.C. and T.L. conceptualized the studies and research designs, and developed the relevant stimuli. F.C. carried out experiment programming and data analysis, and wrote the original draft. T.L. provided funding and supervision. F.C. and T.L. read, reviewed and agreed to the published version of the Article.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Human Behaviour thanks Steven Sloman, Marc Stadtler and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Additional analyses, and Supplementary Tables 1–31 and Figs. 1–7.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Cruz, F., Lombrozo, T. How laypeople evaluate scientific explanations containing jargon. Nat Hum Behav 9, 2038–2053 (2025). https://doi.org/10.1038/s41562-025-02227-0