Mastering All Clinical Scenarios in Radiology: An Invitation to Harm?
Moving Towards Modern Medical Education and Training — Part 7
Significant trainee knowledge gaps exist late in, and at the conclusion of, the fourth year of radiology training programs. These gaps are now well documented (1), at least at the stage where trainees are evaluated before advancing to the Entrustable Professional Activity (EPA) that includes a progression to distance/remote supervision. The gaps very likely persist even after the EPA phase of training.
Many trainees elect fellowships to at least partially close these gaps. Even so, many of our residents and fellows become certified specialists with significant gaps in their knowledge, and those gaps constitute a potential substrate for harm. This is a limitation of the educational system more than of the individuals. There is simply too much material to master.
As discussed in Part 2, the writings of Dr. Lawrence Weed foreshadowed this dilemma (2–5). Simply put, he correctly posited that there is too much information for even highly intelligent and motivated physicians to use efficiently without some directional, organizational framework.
Limiting the scope of licensure is not likely to come about any time soon. Limiting the scope of certification of specialty, however, is an articulated goal of the current movement in Competency Based Medical Education (CBME) at the AMA (6).
Training and Evaluating Competency
Radiology must follow through on narrowing the scope of the general radiologist designation, following the lead of those in the ACGME suggesting that training programs direct a resident to no more than three areas of potential subspecialty mastery (7).
Vascular Interventional Radiology (VIR) training is wisely on such a path, and it could produce better-qualified practitioners if the effort follows through on related CBME curriculum development and meaningful certification evaluation that assures proficiency/competency at the end of the training period. But just the opposite of the forward-looking VIR promise has occurred in the initial, marginal attempts to codify added training and related proof of competence in other domains of diagnostic imaging.
As an example, the Certificate of Added Qualifications (CAQ) concept has harmed the goal of demonstrating true competency in Neuroradiology. If you doubt that assertion, have a conversation, probably in private, with neuroradiology subspecialists who truly understand what it takes to become an effective consultant to our colleagues in neurosurgery, neurology, and otolaryngology.
Before the CAQ in Neuroradiology, attestation of competence by one's peers required two years of fellowship training in a first-rate program and peer recognition through admission to the American Society of Neuroradiology (ASNR) as a senior member. Now one year of training and an examination made up of a few multiple-choice questions, each featuring three or four images, substitute as adequate training and competency evaluation to identify an individual as a certified neuroradiology subspecialist.
Moreover, the related Maintenance of Certification (MOC) process that was supposed to help provide a path to lifelong progress from proficiency/competency to expert has changed. The original ten-year oral examination cycle was modified to a ten-year written examination cycle for recertification and is now relegated to a system where cases are sent out on a regular basis, with a limited number of images and multiple-choice questions related to the case. This has no relation to an effective, competency-based rubric to show real mastery of a logical and complete curriculum over time.
When you are taking care of patients, you get a full set of DICOM images at a computer workstation and no multiple-choice questions, but hopefully an electronic medical record that allows evaluation of the context of the case or, failing that, the opportunity to speak directly with the referring provider. The current CAQ/MOC concept does not come close to replicating the reality of true clinical decision-making.
This approach to proof of even a limited domain of practice cannot create a high-level, expert neuroradiology consultant or assure ongoing proficiency in neuroradiology. It remains in the mode of the trade/apprenticeship model established so long ago, modified though not optimized over the ensuing century. There is still no defined competency-based curriculum. The one year of required training is arbitrary and feeds this haphazard, apprenticeship-style educational model.
The period of training should be as long as it takes to prove mastery in a well-defined skill set.
Any MOC structure needs an effective evaluation rubric. Such evaluation of mastery/competency needs to encourage a continuous commitment to acquiring skill sets in a way that realistically reproduces the practice of consultation and reporting by a neuroradiologist. Such a process will then encourage lifelong progress from proficiency/competency to expert in a manner that advances medical decision-making and best serves our patients.
How many one-year graduates of neuroradiology training programs that lack a tangible curriculum and reasonable competency measurement rubric will aspire to or have a sufficient foundation to achieve that goal?
Our VIR colleagues have recognized that acquiring expertise takes more time and appropriate training. We will see whether they are up to defining a complete curriculum in a competency-based model and an effective method to test for the competencies that constitute their subspecialty. They have made a promising start.
We Can Do Better
We can do a better job with medical education. We can look to extending training periods to the extent that they allow mastery of a defined curriculum. Such a system can continue after initial training, but the curriculum required needs to be codified and true mastery needs to be proven by effective systems of evaluation that replicate real clinical practice.
What else might we do? For practitioners seeking to move from proficiency/competency to expert, support systems available with modern IT tools can aid in decision-making at an expert level before an expert level of professional training is certified. Such tools would also be useful for those physicians who prefer to remain at a proficiency/competent level but encounter clinical circumstances that require more expert thought and input.
Such tools would embody the very insightful suggestions of Dr. Lawrence Weed, who proposed presenting the competent medical generalist with an IT-based decision support system that can assist and help direct logical thought processes (2–5). The system Dr. Weed suggested, based on his theory of knowledge couplers, is capable of directing medical decision-making along likely productive paths while setting aside nonproductive lines of clinical reasoning. Generalist radiologists (whether choosing one subspecialty or several), at any level, must similarly eliminate nonproductive lines of clinical pursuit.
Decision-support IT systems for diagnostic imaging could clearly aid in triaging probable scenarios and in moving the care process forward most efficiently.
Part of what retards this development is a reluctance to invest heavily in such infrastructure the way Mr. Gates put $12,000,000 into the Khan Academy. The Khan Academy experiment in leveraging IT to facilitate modern learning techniques has done a great deal for our youth, especially the underprivileged. It is truly heartwarming to see how even early implementation of these IT tools, flipping the classroom and developing heat maps that trace individual progress through a specific, articulated curriculum, has changed the lives of primary and secondary students in northern California school systems.
I am sure there have been many similar success stories arising from this remarkable advance in educational practice. It remains astonishing to me that, given the wealth concentrated in medicine, we seem unwilling to commit adequate resources to orderly curriculum development and truly effective evaluation rubrics at both the graduate and postgraduate levels of medical education.
Can we in medicine really not keep up with the more enlightened educators in our public school systems?
What must come first is an admission of our current limitations. We then need to commit to a rational, coordinated approach to remaking our educational model in a construct equal to the complexity of what is asked of us. We need to do, and can do, much better at improving patient care and limiting the potential for harm.
Coming Next: Part 8 — Three Ways Medical Education Could Better Serve The Modern Learner: Knowledge is a Commodity, Can We Transfer Wisdom as Readily?
I am on a mission to modernize postgraduate medical education. With my team at the University of Florida, we have spent the last eight years developing a competency-based curriculum and evaluation system for radiology, grounded in modern learning theory. In this essay series, Moving Towards Modern Medical Education and Training, I examine in detail the pathway to modern learning and educational theory and the outcome of applying modern learning principles in this sphere of medical education.
1. UF simulation data. Multiple presentations, RSNA and ASER, 2016.
5. Weed LL. Medical Records, Medical Education, and Patient Care: The Problem-Oriented Medical Record as a Basic Tool. Cleveland (OH): Press of Case Western Reserve University; 1970.
6. AMA.org. Education: Creating the Modern Medical School.
7. ACGME. "Envelope of expectations." 2012.