Editorial

Does clinical experience make up for failure to keep up to date?

ACP J Club. 2005 May-Jun;142:A8. doi:10.7326/ACPJC-2005-142-3-A08



You have just moved to a new town to open an internal medicine practice, and must also find a doctor for your own young family. 2 doctors are accepting new patients:

Jane completed residency 2 years ago, and scored in the top 2% of her class on the certification exams.

Susan completed residency 10 years ago, and scored in the top 25% of her class on the certification exams.

Who do you choose?

When this question has been posed to large clinical audiences over the past few years, only half a dozen people have ever chosen Jane over Susan. Why? Most people would accept not only that Susan started off worse in formal knowledge, but also that her knowledge and application of currently recommended care are likely to have fallen off further, as Choudhry and colleagues have shown in a recent systematic review (1). Why, then, do audiences so consistently choose the more experienced practitioner?

Perhaps for just that reason: experience. The practice of medicine, like many other areas of human endeavor, requires considerable “hands-on” experience to achieve mastery. Most physicians, when asked, indicate that they did not really feel competent for several years after they entered practice. This tallies with estimates from other domains, which suggest that 10 years or 10 000 hours are required to become a virtuoso (2). But what is gained from experience?

Unfortunately, if you believe the conclusions of Choudhry and colleagues, the answer is that nothing is gained and much is lost. They claim to have identified several studies in which longer time in practice is associated with increased mortality. But on closer scrutiny, the differences, when present, are small. The paper states that, in the best study of outcome (3), every year since graduation resulted in a 0.5% increase in mortality in the management of post-myocardial infarction (MI) patients. However, this was an increase in relative risk, so each year of practice equated to an absolute increase in mortality of only about 0.05%, on a baseline post-MI mortality rate of about 10%. After 20 years of practice, the mortality rate would therefore be projected to rise to about 11%. A second study by the same authors (4) showed a nonsignificant increase corresponding to about 0.02% mortality per year, or an absolute increase in risk of 0.4% after 20 years. Since management of MI is an area of medicine in which swift advances in treatment are the norm, these figures might well be regarded as an upper limit. Indeed, closer examination of the other studies of mortality cited by Choudhry and colleagues shows small or absent effects.
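To make the arithmetic explicit, here is a back-of-the-envelope reconstruction using only the figures quoted above:

\[ \underbrace{0.5\%}_{\text{relative increase per year}} \times \underbrace{10\%}_{\text{baseline post-MI mortality}} = 0.05\% \text{ absolute increase per year} \]

\[ 10\% + 20 \text{ years} \times 0.05\% \text{ per year} = 11\% \text{ projected post-MI mortality after 20 years of practice} \]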

These findings reveal an intriguing paradox. Physicians in practice fall behind both on knowledge tests and in adherence to practice guidelines, yet this seems to have minimal impact, if any, on patient outcomes. It could, of course, be that practice guidelines do not make much difference in patient outcomes, and indeed few studies show a benefit of guidelines (5). However, it is not the case that outcomes are simply insensitive to provider differences: in the studies by Norcini and colleagues (3, 4), subspecialization was associated with absolute mortality differences of about 2.5%, and success on a certification examination with an absolute mortality difference of about 2%. So we are left with the conclusion that outcome measures do not decline with age in proportion to the drop in knowledge or guideline compliance. Does that mean that something acquired from experience compensates for a fairly large decline in knowledge and currency of therapeutic approach?

Some indication of this compensatory mechanism can be gleaned from studies of expertise in other domains. The most studied area of expertise is chess, which has been systematically explored for over 50 years. Chess expertise has been shown to be due, in large part, to thousands of hours of deliberate practice over 10 to 20 years (2). The consequence is that chess performance shows a curvilinear relation to age: it rises slowly while expertise is acquired, peaks at about age 35, and declines slowly thereafter.

Expert chess play consists, in large part, of matching the current position to learned patterns of play in memory, what one might call pattern recognition. Not surprisingly, since this is the essential nature of the skill, chess expertise is best observed in speeded play (6), where the player must rapidly select the best move and the ability to recognize patterns of play is a decided asset. Studies in other areas of expertise have also shown that experts actually do better under speeded conditions; expert golfers putt more accurately when told to be fast than when told to be accurate (7).

Some similar observations exist in medicine. Expert dermatologists make a correct diagnosis in an average of 8 seconds; when they are wrong, it takes them about 12 seconds; and when they are unsure, they will ponder for 28 seconds on average (8). Hobus and colleagues (9) have shown that when clinicians are provided with minimal information, the correlation between diagnostic accuracy and experience is 0.68.

Although these findings suggest that experience enables practitioners to make decisions rapidly, they do not explain what it is about experience that produces this skill. We have pursued a line of inquiry based on the assertion that expert clinicians frequently arrive at a diagnosis by mentally comparing the presenting situation with a specific previous case (10-12). The process occurs without conscious reflection, analogous to the way we recognize a friend on the street (12). Thus, it is reasonable to presume that one major component of medical expertise, learned over many years of practice, is a vast mental storehouse of clinical cases on which experts draw repeatedly to arrive at a diagnosis.

Cast in this light, the real puzzle in the review by Choudhry and colleagues is the lack of a positive relation between experience and clinical outcomes. Although clinicians become more and more accomplished at pattern-recognition diagnosis with increasing experience, the benefits of the strategy may come at the cost of reduced flexibility. Hashem and colleagues (13) have presented data showing that specialists tend to cognitively “pull” cases toward the domains with which they have the most experience.

Confirmation of this possibility comes from studies of the Physician Review and Enhancement Program (PREP) in Ontario, Canada, which used a battery of tests of physician performance. These studies, too, found a negative association between age and expertise (14). Systematic consideration of the causes of poor performance in older physicians, however, suggests that premature closure (i.e., excessive reliance on one's early impressions of a case) is a major source of difficulty (15). In other words, more experienced physicians seem more likely to diagnose accurately using pattern recognition, but because of increased reliance on this strategy, they are also less likely to give due consideration to competing diagnoses (16).

The discussion above relates primarily to diagnostic expertise. By and large, however, the performance measures in the review by Choudhry and colleagues reflect either surgical skill or management strategies in circumstances where the diagnosis is a given. A recent study sheds some light on the relation between experience and management. Schuwirth and colleagues (17) assessed rheumatologists with 2 kinds of clinical problems: a computer-based test consisting of 55 written management cases, each focusing on 1 to 4 essential decisions, and a series of 8 incognito standardized patients who visited their offices and completed a performance checklist after the encounter. The standardized patient test had a strong negative correlation (r = −0.50) with the total number of patients seen during the rheumatologist's professional life, again presumably because experts do not require as much information as novices and are therefore penalized by checklist scoring systems (18). In contrast, the computer test was positively correlated (r = 0.58) with lifetime experience. Why the discrepancy with Choudhry and colleagues' findings? Perhaps because responses were subjectively scored by other experts on a case-by-case basis rather than compared with a detailed set of items like a practice guideline. Perversely, it may just be that expertise is evidenced as much in knowing when to depart from guidelines as in knowing what the guidelines say. Indeed, a recent study (19) of hospital clinicians indicated that consultants' approaches to drug therapy were more idiosyncratic than those of house officers, mainly because the consultants were more holistic and adapted their prescribing to the individual patient, whereas junior doctors used a more formulaic approach.

In summary, we have suggested 2 mechanisms to explain the paradox that apparently large declines in measures of knowledge and process of care do not translate into commensurately large differences in patient outcome. First, adherence to prescribed practices of care may be, in some sense, optimal at the population level, but experienced physicians may deliberately and systematically depart from these guidelines to accommodate individual patient needs. As a consequence, they may be penalized on measures based on adherence to prescribed regimens. Second, accumulating evidence shows that experienced physicians rely more on pattern-recognition strategies, which can, to some degree, compensate for a failure to keep up in formal knowledge but can themselves lead to negative consequences (20).

We are not suggesting that the findings of Choudhry and colleagues should be lightly dismissed. They do indicate that physicians are not keeping up with current approaches to patient care. Approaches to maintenance of competence that depend on self-assessment of one's own knowledge and abilities should be critically reexamined. It seems unlikely that admonitions to be more reflective or to identify one's weaknesses can overcome the negative trends identified.

Viewing experience as a double-edged sword, as we have, creates the opportunity for more effective continuing education. Lectures and distribution of printed materials are not effective (21). Learning around specific cases in which individuals are challenged to apply the latest research evidence might be (22). Doing so in a context in which participants are required to respond to feedback is likely to provide further incremental benefit, especially if that feedback is derived from individuals with heterogeneous backgrounds and varied levels of expertise. In general, the position presented in this editorial leads us to advocate recognizing the unique strengths that experience provides while simultaneously developing and investigating continuing education strategies that reignite the analytic tendencies of individuals for whom medical practice has become excessively automated and routinized.

Geoffrey R. Norman, PhD
Kevin W. Eva, PhD
McMaster University
Hamilton, Ontario, Canada


References

1. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260-73. [PubMed ID: 15710959]

2. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79:S70-81. [PubMed ID: 15383395]

3. Norcini JJ, Kimball HR, Lipner RS. Certification and specialization: do they matter in the outcome of acute myocardial infarction? Acad Med. 2000;75:1193-8. [PubMed ID: 11112721]

4. Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ. 2002;36:853-9. [PubMed ID: 12354248]

5. Straus SE, Jones G. What has evidence based medicine done for us? BMJ. 2004;329:987-8. [PubMed ID: 15514317]

6. Burns BD. The effects of speed on skilled chess performance. Psychol Sci. 2004;15:442-7. [PubMed ID: 15200627]

7. Beilock SL, Bertenthal BI, McCoy AM, Carr TH. Haste does not always make waste: expertise, direction of attention, and speed versus accuracy in performing sensorimotor skills. Psychon Bull Rev. 2004;11:373-9. [PubMed ID: 15260208]

8. Norman GR, Rosenthal D, Brooks LR, Allen SW, Muzzin LJ. The development of expertise in dermatology. Arch Dermatol. 1989;125:1063-8. [PubMed ID: 2757402]

9. Hobus PP, Schmidt HG, Boshuizen HP, Patel VL. Contextual factors in the activation of first diagnostic hypotheses: expert-novice differences. Med Educ. 1987;21:471-6. [PubMed ID: 3696019]

10. Allen SW, Norman GR, Brooks LR. Experimental studies of learning dermatologic diagnosis: the impact of examples. Teach Learn Med. 1991;4:35-44.

11. Kulatunga-Moruzi C, Brooks LR, Norman GR. Coordination of analytic and similarity-based processing strategies and expertise in dermatological diagnosis. Teach Learn Med. 2001;13:110-6. [PubMed ID: 11302031]

12. Hatala R, Norman GR, Brooks LR. Influence of a single example upon subsequent electrocardiogram interpretation. Teach Learn Med. 1999;11:110-7.

13. Hashem A, Chi MT, Friedman CP. Medical errors as a result of specialization. J Biomed Inform. 2003;36:61-9. [PubMed ID: 14552847]

14. Norman GR, Davis DA, Lamb S, et al. Competency assessment of primary care physicians as part of a peer review program. JAMA. 1993;270:1046-51. [PubMed ID: 8350446]

15. Caulford PG, Lamb SB, Kaigas TB, et al. Physician incompetence: specific problems and predictors. Acad Med. 1994;69:S16-8. [PubMed ID: 7916814]

16. Eva KW. The aging physician: changes in cognitive processing and their impact on medical practice. Acad Med. 2002;77:S1-6. [PubMed ID: 12377689]

17. Schuwirth L, Gorter S, van der Heijde D, et al. The role of a computerized case-based testing procedure in practice performance assessment. Adv Health Sci Educ Theory Pract. 2005 (in press).

18. Hodges B, Regehr G, McNaughton N, Tiberius R, Hanson M. OSCE checklists do not capture increasing levels of expertise. Acad Med. 1999;74:1129-34. [PubMed ID: 10536636]

19. Higgins MP, Tully MP. Hospital doctors and their schemas about appropriate prescribing. Med Educ. 2005;39:184-93. [PubMed ID: 15679686]

20. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98-106. [PubMed ID: 15612906]

21. Grimshaw JM, Shirran L, Thomas R, et al. Changing provider behavior: an overview of systematic reviews of interventions. Med Care. 2001;39:II2-45. [PubMed ID: 11583120]

22. Herbert CP, Wright JM, Maclure M, et al. Better Prescribing Project: a randomized controlled trial of the impact of case-based educational modules and personal prescribing feedback on prescribing for hypertension in primary care. Fam Pract. 2004;21:575-81. [PubMed ID: 15367481]