
Milestones or Millstones?

Good intentions do not necessarily lead to good results.  A case in point is the milestones initiative of the Accreditation Council for Graduate Medical Education (ACGME) and the medical specialty boards, which are working together in an attempt to improve the quality of graduate medical education.  In practice, however, the milestones are often not proving to be a valuable indicator of learner progress and are instead acting like millstones around the necks of trainees and program directors.

The goals behind the milestones initiative are laudable.  Introduced as part of the Next Accreditation System (NAS), the milestones were intended to shift the attention of learners and educators from processes to outcomes.  They would foster self-directed learning and assessment and provide more helpful feedback.  In theory, programs that were doing well would face less burdensome oversight, and under-performing ones would receive more prompt and helpful guidance.

In practice, however, the milestones initiative has reminded many program directors and trainees of the onerous impact of maintenance of certification programs enacted by the American Board of Medical Specialties.  Simply put, when the lofty rhetoric of initial assurances is set aside, the risks and costs of such initiatives appear to many to exceed the benefits by an unacceptably high margin.  In many cases, this can be traced to a failure to assess outcomes before implementing system-wide change.

Graduate medical education has been changing at a fast and furious pace of late, and many program directors feel as though they are being buried by one avalanche after another of new requirements.  Such rapid and sweeping changes would seem to suggest that the existing system was badly broken.  To many educators and trainees, however, the system was functioning reasonably well, producing well-trained, competent, and safe physicians.

The assumption seems to be that program directors, faculty members, and trainees cannot be relied on to do a good job of monitoring educational progress, providing encouragement, requiring improvement where change is needed, and preserving the educational integrity of their programs and people.  In our experience, however, the greatness of educational programs often lies less in systems than in people.

These days when a program director meets with a trainee, they often spend nearly all of their time ticking off boxes, leaving little opportunity to get to know one another and identify problems and opportunities that deserve attention.  Initiatives such as the milestones have an unfortunate tendency to treat all learners and programs more or less the same, fostering homogenization and stunting the development of distinctive interests and abilities.

It is a mistake to suppose that educational quality is directly proportional to the number or difficulty of standardized hoops that learners, educators, and program directors must jump through.  To the contrary, in many cases adding hoops only draws time and energy away from more important pursuits that would ultimately make a bigger difference for the profession of medicine and the patients it serves, now and in the future. 

Consider a similar sea change in graduate medical education from a decade and a half ago: the imposition of duty hour limits.  It makes sense that error rates would rise when trainees are insufficiently rested.  In practice, however, the duty hour limits have failed to ensure that trainees get more rest, and there is mounting evidence that these requirements are doing at least as much harm as good in terms of error rates and overall educational quality.

Before changes such as duty hour limits and the milestones are implemented, accrediting organizations should make sure they have done their homework.  Results that seem entirely predictable on paper or from the vantage point of a boardroom do not necessarily hold up in practice.  There is too much at stake to implement such sweeping changes without first ascertaining what actually happens in the training programs and health centers where they are put into effect.

Why are such far-reaching changes in graduate medical education being promulgated with increasing frequency?  One factor is probably the desire of each accrediting agency’s new board members and staff to make an impact, based on the presumption that if they are not doing something new and big, they are not doing a good job.  In fact, however, in regulating medical education as in raising children, a move away from micromanagement is often just what the doctor ordered.

To repeat, there is no evidence that increasing systematization, standardization, or documentation will necessarily improve educational outcomes, any more than requiring parents to produce mounds of paperwork will necessarily enhance childrearing.  The time has come to stop treating learners, faculty members, and program directors as if they are incompetent and untrustworthy, and instead to recognize that they know their educational needs better than anyone else.

Too often, well-intentioned plans hatched in big cities such as Philadelphia, Chicago, and Washington, DC, are simply too far removed from what is actually happening on the front lines of education to produce a favorable ratio of benefits over harms.  Far from stimulating programs to better performance, these initiatives too often weigh them down.  For many program directors who already feel as though they are barely keeping their heads above water, the result is drowning.

Richard Gunderman, MD, and Darel Heitkamp, MD, are faculty at Indiana University.

1 reply

  1. Well said. In the spirit of the ABMS/MOC racket, the ACGME has developed a colossal waste of time and effort that directly erodes the opportunities for real medical training for residents. If they were aware of any real “outcomes,” they would realize that their process-driven pet projects have the cumulative effect of distracting educators and trainees. The desk-surfing class has to do something to justify its existence off the backs of those doing the heavy lifting, so relentless churning of metrics has become the new honey pot.