Competency or Passing the Boards? Every patient wants an expert.

Moving Towards Modern Medical Education and Training — Part 4

In this installment of the series, we review the current state of postgraduate medical education and training and the related methods of competency evaluation. The focus will be on three key components of our education and training model and, in particular, how they affect our unwritten social contract (1): the obligation to prove to our patients that our approach to education creates a safe and effective system of professional behavior and competency on their behalf.

Those three elements are:

1. Curriculum development
2. The testing rubrics that are proxies for establishing competency
3. What we define today as “competency”

How is expertise acquired and how do we measure it?

With regard to the last of these three topics, we must understand the complex relationship between mastery, competency, and the parameters that constitute an acceptable rate of error in individuals considered to have become competent or expert under the current system. We must also freely admit that it is unreasonable to believe that any single individual can perform at an expert level in all aspects of diagnostic imaging.


The Accreditation Council for Graduate Medical Education (ACGME), in its “Envelope of Expectations,” indicates that a trainee in Diagnostic Radiology should reach a level of “Competent” between late in the 2nd year and the beginning of the 3rd year (Figure 1).

Figure 1. Expectation of level of competency achieved by diagnostic radiology residents at each phase of training and practice.

Beyond that specified, Milestone-related ACGME goal, there is a progressive expectation that the graduating resident become “Proficient” by the end of the 4th year (Figure 1).

It might be difficult for a patient to understand the difference between “Competent” and “Proficient,” since neither term is objectively defined or established by the current curriculum requirements or by sufficient evaluation rubrics, either during or following the completion of training. So where is the real measure of these goals?

Beyond those “Expectations” plateaus, the ACGME Envelope of Expectations, as a Milestone surrogate, expresses an aspiration that the practicing radiologist attain an “Expert” level of competency sometime shortly after graduation, with that expert status to be assured by a process of Maintenance of Certification (MOC) (1). What guarantee does the patient have that the curriculum and evaluation methodologies of the MOC process somehow create or assure “Expert” professional behavior?

Truthfully, there is no such guarantee.

Establishing expert-level behavior/competency is a process that requires intense self-discipline and study, referred to by some as Deliberate Practice (2–4). To produce a true expert, this discipline must persist for at least several years beyond training, preferably in an environment that provides feedback and accountability for progress. So the aspiration of attaining expert-level professional results becomes a matter of personal responsibility rather than one imposed and measured by an external authority.

Realistically, our patients want to believe that we are truly expert as we engage in medical decision-making on their behalf. To fulfill this expectation, we must have a never-ending curriculum and a method of self-evaluation and self-discovery that inspires and requires the Deliberate Practice that makes true expertise possible with sufficient effort.

Deliberate Practice is what makes the difference between someone who plays an instrument occasionally and produces, perhaps, pleasant music and someone who practices enough to be part of an orchestra or, preferably, to be first chair or a concert soloist (2–4). For those who cannot, or are satisfied not to, become expert, we owe our patients an accessible expert reasoning and problem-solving support mechanism, as Dr. Weed suggested in the 1960s (5–9). For those who aspire to routine expert-level performance, we owe an educational system that can help them develop expert competency and confirm that it has been attained, maintained, and, in fact, grows with time.

The terms adopted by the ACGME to describe the “levels” of professional development in their “Envelope of Expectations” are not defined (Figure 1). As a result, the “Envelope of Expectations” is open to a good deal of subjective interpretation of what constitutes reasonable progression of professional growth during training. The related ACGME Milestones do not constitute a curriculum lesson plan; they are more of a conceptual framework. There are no specific, progressive evaluation rubrics to monitor the mastery of specific competencies during training.

The current system enforces and certifies progression in learning based, rather simply, on the subjective evaluations that mark each year of training. The only specified, objective evaluation rubric of competency within the “Envelope of Expectations” is the ABR Core Exam. However, the Core Exam does not measure “real-life” professional radiology skill-set competency. Therefore, the “Competencies,” “Milestones,” and “Envelope of Expectations” are essentially progressively complex ACGME mission statements.

As such, these policy statements do not specifically inform the teaching methodology and theory necessary to choose and exploit effective adult learning tools appropriate to each level of professional growth within the ACGME “Envelope.”

Eventually an effective curriculum must set the stage for lifelong “Deliberate Practice” to make us all experts for our patients. Our patients believe we are experts. Expert level competence is certainly what they desire in those who engage in medical decision-making on their behalf. Expert professional behavior is what we should aspire to and achieve to deserve their trust in our professionalism.

So, as we become more informed as educators, there is another layer of understanding and curriculum development that we owe our trainees and patients in pursuit of this necessary aspirational goal of eventual expert level competence.

The central question in pursuit of a competency-based curriculum becomes: What can we do to align well-known and well-documented adult learning principles with each level of desired professional educational development to create an accountable system of true competency-based training and evaluation?

We must take on this task. To do so, we must seriously consider the state of the trainee as they move through the levels of training and experience represented in the ACGME “Envelope of Expectations.”

We can begin that exploration by endeavoring to understand what actually constitutes competency. Dreyfus and Dreyfus provide us with a nomenclature for, and definition of, the levels of competence that can lead to expert-level development (10). These levels match up reasonably well with the “stages” in the ACGME Envelope of Expectations. The Dreyfus model defines each level in terms of how the learner perceives and responds to a situation; that situational awareness can then be matched with an appropriate application of experiential and other adult learning theory (11–15).

The five Dreyfus levels are:

1. Novice: Rule-based behavior, strongly limited and inflexible
2. Experienced Beginner: Incorporates aspects of the situation
3. Practitioner: Acting consciously from long-term goals and plans
4. Knowledgeable practitioner: Sees the situation as a whole and acts from personal conviction
5. Expert: Has an intuitive understanding of the situation and zooms in on the central aspects

As we consider the arguments and data presented in this essay, it may become apparent that what is missing in our diagnostic radiology postgraduate educational system is a properly structured beginning point at the Novice level. The specific question is: Do we teach the fundamentals of observation and data gathering from images in a manner that is reproducible and reliable?

All other interpretive skill flows from that fundamental one. Failing that, the remainder of the educational experience will produce sub-optimal results.

In the next installment we will review the historic error rates in positive studies for radiology trainees and the resulting “education opportunity gaps” they create. We will also explore how those gaps present a challenge to achieving expert performance.

I am on a mission to modernize postgraduate medical education. With my team at the University of Florida, we have spent the last eight years developing a competency-based curriculum and evaluation system for radiology, based on modern learning theory. In this essay series, Moving Towards Modern Medical Education and Training, I examine in detail the pathway to modern learning and educational theory, and the outcomes of applying modern learning principles in this sphere of medical education.

Part 1: Medical Education: How Did We Arrive at the Current State?

Part 2: “See one do one teach one”

Part 3: Teaching to the Test

Coming next:
Part 5 — The challenge to achieving expert performance: reducing the historic 30% error rate in positive studies.


1. Starr, P. (1982). The Social Transformation of American Medicine. New York, NY: Basic Books. ISBN 0-465-07935-0.
2. Ericsson, K. A. (2004). Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Academic Medicine, 79(10 Suppl), S70–S81.
3. Ericsson, K. A. (2006). The influence of experience and deliberate practice on the development of superior expert performance. In K. A. Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The Cambridge Handbook of Expertise and Expert Performance (pp. 683–704). New York, NY: Cambridge University Press.
4. Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363–406.
5. Weed, L. L. (1964). Medical records, patient care, and medical education. Irish Journal of Medical Science, 462, 271–282.
6. Weed, L. L. (1968). Medical records that guide and teach. The New England Journal of Medicine, 278(11), 593–600. doi:10.1056/NEJM196803142781105. PMID 5637758.
7. Weed, L. L. (1968). Medical records that guide and teach (concluded). The New England Journal of Medicine, 278(12), 652–657. doi:10.1056/NEJM196803212781204. PMID 5637250.
8. Weed, L. L. (1970). Medical Records, Medical Education, and Patient Care: The Problem-Oriented Medical Record as a Basic Tool. Cleveland, OH: Press of Case Western Reserve University.
9. Jacobs, L. (2009). Interview with Lawrence Weed, MD: the father of the problem-oriented medical record looks ahead [editorial]. The Permanente Journal, 13(3), 84–89.
10. Dreyfus, S. E. (2004). The five-stage model of adult skill acquisition. Bulletin of Science, Technology & Society, 24, 177.
11. Knowles, M. (1984). The Adult Learner: A Neglected Species (3rd ed.). Houston, TX: Gulf Publishing.
12. Kolb, D. A. (1984). Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall.
13. Bloom, B., Engelhart, M., Furst, E., Hill, W., & Krathwohl, D. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. New York, NY: Longmans, Green.
14. Krathwohl, D. R. (2009). Methods of Educational and Social Science Research: An Integrated Approach (3rd ed.). Long Grove, IL: Waveland Press.
15. Wiggins, G., & McTighe, J. (1998). Understanding by Design. Alexandria, VA: ASCD.
