By BRYAN CARMODY, MD
“YOUR LIKELIHOOD OF SECURING RESIDENCY TRAINING DEPENDS ON MANY FACTORS – INCLUDING THE NUMBER OF RESIDENCY PROGRAMS YOU APPLY TO.”
So begins the introduction to Apply Smart: Data to Consider When Applying to Residency – an informational campaign from the Association of American Medical Colleges (AAMC) designed to help medical students “anchor [their] initial thinking about the optimal number of applications.”
In the era of Application Fever – where the mean number of applications submitted by graduating U.S. medical students is now up to 60 – some data-driven guidance on how many applications to submit would be welcome, right?
And yet, the more I review the AAMC’s Apply Smart campaign, the more I think that it provides little useful data – and the information it does provide is likely to encourage students to submit even more applications.
This topic will be covered in two parts. In the first, I’ll explore the Apply Smart analyses and air my grievances against their logic and data presentation. In the second, I’ll suggest what the AAMC should do to provide more useful information to students.
Introduction to Apply Smart
The AAMC unveiled Apply Smart for Residency several years ago. The website includes lots of information for students, but the pièce de résistance is the set of analyses and graphics that relate the number of applications submitted to the likelihood of successfully entering a residency program.
Recently, I was on The Accad and Koka Report to share my opinions on USMLE Step 1 scoring policy. (If you’re interested, you can listen to the episode on the show website or iTunes.)
Most of the topics we discussed were ones I’ve already dissected on this site. But there was an interesting moment in the show, right around the 37:30 mark, that raises a point worthy of further analysis.
ANISH: There’s also the fact that nobody is twisting the arms of program directors to use [USMLE Step 1] scores, correct? Even in an era when you had clinical grades reported, there still seems to be value that PDs attach to these scores. . . There’s no regulatory agency that’s forcing PDs to do that. So if PDs want to use, you know, a number on a test to determine who should best make up their class, why are you against that?
BRYAN: I’m not necessarily against that if you make that as a reasoned decision. I would challenge a few things about it, though. I guess the first question is, what do you think is on USMLE Step 1 that is meaningful?
ANISH: Well – um – yeah…
BRYAN: What do you think is on that test that makes it a meaningful metric?
ANISH: I – I don’t- I don’t think that – I don’t know that memorizing… I don’t even remember what was on the USMLE. Was the Krebs Cycle on the USMLE Step 1?
I highlight this snippet not to pick on Anish, who was a gracious host – despite our back-and-forth on Twitter, we actually agreed much more than we disagreed. And as a practicing clinician who is 15 years removed from the exam, I’m not surprised in the least that he doesn’t recall exactly what was on the test.
I highlight this exchange because it illuminates one of the central truths in the #USMLEPassFail debate, and that is this:
Physicians who took Step 1 more than 5 years ago honestly don’t have a clue about what is tested on the exam.
That’s not because the content has changed. It’s because the memories of minutiae fade over time, leaving behind the false memory of a test that was more useful than it really was.
I’m speaking from experience here.
Let me show you some data.
I’m going to show you the Match rate and mean Step 1 score for three groups of residency applicants. These are real data, compiled from the National Resident Matching Program’s (NRMP) Charting Outcomes in the Match reports.
- U.S. Allopathic Seniors: 92% match rate; Step 1 232.3
- U.S. Osteopathic Seniors: 83% match rate; Step 1 225.8
- International Medical Graduates, or IMGs (both U.S. and non-U.S. citizens): 53% match rate; Step 1 223.6
Now. What do you conclude when you look at these numbers?
In the debate over the U.S. Medical Licensing Examination’s (USMLE) score reporting policy, there’s one objection that comes up time and time again: that graduates from less-prestigious medical schools (especially IMGs) need a scored USMLE Step 1 to compete in the match with applicants from “top tier” medical schools.
In fact, this concern was recently expressed by the president of the National Board of Medical Examiners (NBME) in an article in Academic Medicine (quoted here, with my emphasis added).
“Students and U.S. medical graduates (USMGs) from elite medical schools may feel that their school’s reputation assures their successful competition in the residency application process, and thus may perceive no benefit from USMLE scores. However, USMGs from the newest medical schools or schools that do not rank highly across various indices may feel that they cannot rely upon their school’s reputation, and have expressed concern in various settings that they could be disadvantaged if forced to compete without a quantitative Step 1 score. This concern may apply even more for graduates of international medical schools (IMGs) that are lesser known, regardless of any quality indicator.”
The funny thing is, when I look at the data above, I’m not sure why we would conclude that IMGs are gaining an advantage from a scored Step 1. In fact, we might conclude just the opposite – that a scored Step 1 is a key reason why IMGs have a lower match rate.