Checklists used in the diagnostic reasoning process can be divided into two categories. Content-specific checklists provide clinicians with relevant knowledge during the diagnostic process or trigger them to activate their own knowledge. For example, they may list specific diagnostic steps or suggest possible diagnoses to consider for a given patient.
One example is a checklist for interpreting electrocardiograms (ECGs) that includes content-specific features, such as calculation of the heart rate. In several studies, Sibbald and colleagues found that use of this checklist reduced diagnostic errors in ECG interpretation.7–9 Other studies have also shown an effect of content-specific checklists, but the effects are often small or apply only to a subgroup of clinicians.10,11
Process-focused checklists aim to trigger more deliberate thinking when diagnosing a patient. An example is a “debiasing” checklist that aims to reduce errors that occur due to shortcuts in the diagnostic reasoning process (i.e., cognitive biases).12 These checklists often contain items such as “what else can it be?”
A recent study by O’Sullivan and Schofield evaluated the effect of a cognitive forcing mnemonic, called “SLOW”, on diagnostic accuracy. “SLOW” is an acronym for four questions: (1) “Sure about that? Why?” (2) “Look at the data: what is lacking? Does it all link together?” (3) “Opposite: what if the opposite is true?” and (4) “Worst case scenario: what else could this be?” A randomized trial found no effect of the SLOW intervention (compared with no intervention) on diagnostic accuracy in clinical vignettes.13 Similarly, most studies evaluating process-focused checklists have found no significant effect on accuracy.14,15
Two studies have directly compared content-specific and process-focused checklists. In a study by Sibbald and colleagues on ECG interpretation, the content-specific (knowledge-based) checklist described above was compared with a process-focused (debiasing) checklist and a control condition. Overall, neither checklist significantly improved performance on ECG interpretation.14 This finding contrasts with several earlier studies by Sibbald et al. in which the content-specific checklist did show an effect.7,8
A study by Shimizu and colleagues compared medical students’ intuitive process (i.e., listing the three most likely diagnoses) with one of two checklists: (1) a content-specific checklist that suggested differential diagnoses for the case at hand or (2) a process-focused checklist, i.e., a general debiasing checklist developed by Ely et al.,5 with items such as “Did I obtain a complete medical history?” and taking a “diagnostic time out.”
The authors exposed participants to both simple and difficult clinical case vignettes based on actual patient experiences. Overall, the checklists did not improve accuracy in the easy cases; on the contrary, their use reduced diagnostic accuracy. In the difficult cases, the content-specific checklist improved diagnostic accuracy, whereas the debiasing checklist was not effective for either simple or difficult cases.16
Taking all this research into account, content-specific checklists seem more promising than process-focused checklists, but the evidence base is still thin, resting on only a few studies.