The accuracy of medical diagnosis is often degraded when diagnosis deviates from the principles of normative decision making. Although diagnostic errors can potentially be reduced by metacognitive training, that proposal (put forth by Croskerry in an accompanying article) needs validation and extensive exploration.
In his article in this issue of Academic Medicine, Croskerry is correct to emphasize how frequently medical diagnosis departs from the pristine boundaries of normative decision making.1 Many, or perhaps even most, medical diagnoses arise instead from Reason's flesh-and-blood processes that rely heavily on subconscious framing of the problem, extensive use of simplifications, and diagnostic assignments based on heuristic thinking with its inherent biases. Croskerry's recent compilation of these cognitive dispositions to respond2 (see List 1 of his article in this issue for an abbreviated version) is an excellent starting place to examine the many ways in which the process of arriving at a diagnosis deviates from normative techniques.
Croskerry goes on to propose that diagnostic accuracy could be improved if clinicians can be debiased through metacognitive training. My colleagues and I wholeheartedly agree,3 but we wonder whether this can be accomplished. Croskerry cites examples of successful debiasing experiments and holds up the existence of medical experts as proof that debiasing can succeed. There are a few examples of training exercises that seem to succeed in debiasing the learners. However, these have typically been classroom demonstrations, with testing done in close temporal proximity to the actual training. It is an unproven assumption that success in improving the accuracy of decision making in a laboratory setting, or in a field outside medicine, guarantees the same results in the sometimes unique world of medical decision making. Finally, according to the current paradigms, experts achieve their status through an overwhelming mastery of content-specific knowledge, not through training in the art of metacognition.
We agree with Croskerry that, in theory, metacognitive training will inevitably improve diagnostic accuracy, but the general pessimism regarding this approach exists for good reason. Metacognitive training cannot be presented as a proven success story; this technique needs to be validated through research and application. Can we train physicians to employ metacognition during the decision-making process? Can we actually show that diagnostic accuracy improves? What techniques work the best? During which stage of medical education should these skills be taught? Is there a downside if we teach clinicians to doubt their first impressions?
The challenge for those of us who believe in the potential of metacognitive training to improve diagnostic accuracy is to begin addressing these questions and to validate this approach in the real world of day-to-day medical care.