Is there a faster way to improve critical thinking? A growing body of research indicates that the answer is yes: we can improve critical thinking through short, targeted interventions.
Critical thinking is a well-known yet nebulous term. We all tend to have our own subjective definitions of it. Researchers also define critical thinking in different ways, which leads to many different approaches to studying it. Some researchers evaluate critical thinking by judging a person’s thought process, while others focus on errors of logic and reasoning.
However, there is one aspect of critical thinking research on which many agree: developing critical thinking skills takes time. As some research suggests, a lifetime of experience can lead to marked improvements in reasoning. Critical thinking is typically measured in schools, and intervention studies usually take place in educational contexts that span months, if not years. While most studies find improvement in critical thinking skills at the end of a school year, college course, or educational program, could improvements be made more quickly?
Recently, my team tried to answer this question by conducting a very brief intervention study, delivered as an online survey, that did indeed have a marked impact on critical thinking. The yet-to-be-published investigation examined three forms of reasoning across a variety of hypothetical scenarios before and after a text-based instructional intervention. We included different reasoning scenarios at pre- and post-test, a design that let us examine the change in participants’ thinking. The types of reasoning examined included sunk cost scenarios (e.g., continuing to watch a boring, bad movie solely because you already paid for it), counterfactual “if only” thinking (e.g., “If only I hadn’t hit the snooze button, I wouldn’t have been in this fender bender”), and probability estimation (i.e., understanding basic ratios).
Our intervention group (i.e., the experimental group) received an instructional block halfway through the survey about how to recognize potential sunk cost scenarios, what to expect to happen to one’s reasoning when sunk costs are present, and how to avoid the sunk cost fallacy through more logical, goal-oriented behavior. The control group received a similarly structured instructional block, but about cognitive dissonance theory. Both groups improved from pre-test to post-test, but the experimental group improved more (gave more correct answers) on the sunk cost scenarios than the control group did. The intervention appeared to help people reason more logically when sunk costs were present.
Our study suggests that some people seem ready to improve after simply being given an opportunity to reflect on their own reasoning skills, as evidenced by the individuals in the control group who improved from pre-test to post-test. But the study also demonstrates important individual differences: some people need only an opportunity to reflect in order to improve their critical thinking (which may indicate greater prior training or skill-building opportunities), whereas others improve with brief but overt instruction. Still others likely require more intensive instruction to build missing skills or correct faulty logical patterns (e.g., avoiding the traps of “fake news” and “alternative facts”). Note, however, that the improvements did not transfer to the other types of reasoning.
Our research builds on other recent work. In one attempt to answer this question, Andrew Hafenbrack and colleagues performed an experiment a few years ago aimed at improving one element of reasoning: the ability to resist the sunk cost bias. In Hafenbrack’s intervention study, the researchers gave participants a 15-minute mindfulness exercise, then measured their resistance to sunk costs. The control group engaged in mind wandering. In the end, the mindfulness group resisted sunk costs more than the mind-wandering controls did. In short, it seems that critical thinking can be improved.
A research summary from 2011 offers further evidence. Emily Lai’s literature review on critical thinking notes that it is important to consider whether an intervention aims to improve the cognitive skills of the thinker (e.g., planning, reasoning, or problem solving) or the thinker’s dispositions (“open and fair-mindedness, inquisitiveness, flexibility, a propensity to seek reason, a desire to be well informed and a respect for and willingness to entertain diverse viewpoints,” in Lai’s words). Another major distinction is whether the intervention is general (e.g., targeting all of reasoning) or domain-specific (e.g., targeting only sunk costs).
What’s key, according to Lai, is that people must be explicitly taught how to transfer critical thinking skills from one domain to another. Real-world tasks and scenarios are most effective because they are authentic to the learning environment and place the learner at the center of the learning process. In this regard, our short intervention may have worked well because it was so explicit.
We all start learning critical thinking skills early in life, but we generally do not master these skills, likely because they can be so context-dependent. However, adults can and do learn to improve critical thinking, especially with good interventions that are overt and explicit.
The views in this article are not necessarily the views of the Reboot Foundation.
By Joseph McFall, a developmental psychologist and associate professor of psychology at the State University of New York at Fredonia.