By BRYAN CARMODY, MD
“YOUR LIKELIHOOD OF SECURING RESIDENCY TRAINING DEPENDS ON MANY FACTORS – INCLUDING THE NUMBER OF RESIDENCY PROGRAMS YOU APPLY TO.”
So begins the introduction to Apply Smart: Data to Consider When Applying to Residency – an informational campaign from the Association of American Medical Colleges (AAMC) designed to help medical students “anchor [their] initial thinking about the optimal number of applications.”
In the era of Application Fever – where the mean number of applications submitted by graduating U.S. medical students is now up to 60 – some data-driven guidance on how many applications to submit would be welcome, right?
Right?
And yet, the more I review the AAMC’s Apply Smart campaign, the more I think that it provides little useful data – and the information it does provide is likely to encourage students to submit even more applications.
This topic will be covered in two parts. In the first, I’ll explore the Apply Smart analyses and air my grievances against their logic and data presentation. In the second, I’ll suggest what the AAMC should do to provide more useful information to students.
Introduction to Apply Smart
The AAMC unveiled Apply Smart for Residency several years ago. The website includes lots of information for students, but the pièce de résistance is the set of analyses and graphics that relate the number of applications submitted to the likelihood of successfully entering a residency program.
The first thing we need to do is get oriented to these figures.
The Apply Smart graphics all take a similar form, so for sake of example, let’s examine the one for U.S. medical graduates who applied to residency programs in internal medicine.
APPLY SMART: INTERNAL MEDICINE
THE X-AXIS
The x-axis shows the number of applications submitted by students applying for residency positions. Pretty straightforward.
THE Y-AXIS
The y-axis is labeled “Probability of Entering a Residency Program.” It corresponds to the percentage of applicants who successfully entered a residency program in internal medicine. Everyone who enters an internal medicine residency program gets counted, regardless of whether they entered through the Match or the SOAP.
THE CURVES
The curves show the probability of entering a residency program in that specialty for the group of candidates who applied to x number of programs.
The curves were created using spline regression. By tracing each curve, we can see how the probability of residency entry changes with the number of applications submitted by different groups of candidates.
The graphic for internal medicine (and most other specialties) has three curves. Each corresponds to a group of applicants with a particular range of USMLE Step 1 scores (bottom, middle, or upper tertile).
THE “POINT OF DIMINISHING RETURNS”
Each curve has a point labeled as the point of diminishing returns – the point at which the relationship between the number of applications submitted and the probability of entering a residency program changes. (This was calculated as the first knot in the spline regression.)
Up to the point of diminishing returns, applicants who submit x+1 applications have a higher probability of entering an internal medicine residency program than those who submit just x. After the point of diminishing returns, the probability of entering a residency program stops increasing with each additional application.
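The AAMC hasn’t published the full details of its model, but the idea is easy to demonstrate. Here’s a minimal sketch, with everything invented for illustration: simulate a probability curve that rises and then flattens at 23 applications, fit a one-knot piecewise-linear spline, and locate the knot by grid search. (The AAMC’s actual analysis may use more knots or a different spline basis; this is just the concept.)

```python
import numpy as np

rng = np.random.default_rng(0)
apps = np.arange(1, 61)  # 1 to 60 applications submitted

# Invented "probability of entry" curve: rises, then flattens after 23.
true_prob = 0.40 + 0.013 * np.minimum(apps, 23)
observed = true_prob + rng.normal(0, 0.01, apps.size)  # add sampling noise

def rss_with_knot(x, y, knot):
    """Residual sum of squares for y ~ b0 + b1*x + b2*max(x - knot, 0)."""
    X = np.column_stack([np.ones(x.size), x, np.maximum(x - knot, 0)])
    _, residuals, *_ = np.linalg.lstsq(X, y, rcond=None)
    return residuals[0]

# Grid-search the knot location that best fits the data.
best = min(range(5, 56), key=lambda k: rss_with_knot(apps, observed, k))
print(f"Estimated point of diminishing returns: {best} applications")
```

With this made-up data, the grid search should land at or very near 23 – the knot used to generate the curve.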
WHAT ARE WE SUPPOSED TO CONCLUDE FROM THE APPLY SMART GRAPHICS?
Here’s what we’re supposed to take away from these graphics. Let’s use the emergency medicine graphic as an example.
APPLY SMART: EMERGENCY MEDICINE
Suppose I want to apply in emergency medicine, and I have an ‘average’ USMLE Step 1 score (middle tertile, 221-237). That would put me on the gray curve in the chart above, and would place my point of diminishing returns at 23 applications (with a confidence interval of 22-25).
Bear in mind, the mean number of applications submitted by U.S. MD graduates applying in EM in 2018-2019 was 51. But according to the AAMC, applying to more than 23 programs will not significantly increase my probability of entering an EM residency program – so why waste my time and money?
At first glance, this sounds great. Maybe if all applicants used the Apply Smart graphics, they’d stop applying to so many programs, and we’d finally break Application Fever? Right?
I mean, why not?
My issues with Apply Smart
I’ve got a few.
I’ll begin with more philosophical issues before highlighting the major methodological flaw that is likely to prevent the Apply Smart campaign from doing anything other than stimulating students to submit even more applications.
PROBLEM #1 – PERPETUATION OF MISINFORMATION
The official video for Apply Smart states:
You’ve probably heard that residency slots aren’t growing at the same rate as graduating medical students – so an already complex and competitive situation has become, well, even more complex and competitive.
Here’s the problem. It’s not true.
Here’s what is true. There are more residency applicants than there are PGY-1 residency positions. (That’s been the case since 1992.)
What is not true is that the disparity between applicants and available positions is widening.
SCREENSHOT FROM THE OFFICIAL APPLY SMART VIDEO, IN WHICH A MEDICAL STUDENT WITH A HANGDOG FACIAL EXPRESSION PONDERS THE WIDENING DISPARITY BETWEEN MEDICAL SCHOOL GRADUATES AND RESIDENCY POSITIONS. (CHEER UP, BRO! IT’S NOT TRUE!)
The number of medical graduates is increasing – but so is the number of PGY-1 residency spots. So the best way to look at the disparity between the two is by calculating the number of available positions per applicant.
Since 1996, that ratio has been relatively stable around 0.8 positions per Match applicant (range: 0.75-0.86). In fact, there has been an improving trend over the last 5 years or so, as you can see from the graphic below (taken from the 2019 NRMP Match Results & Data report).
Notice that, for U.S. allopathic seniors, there is a substantial (and increasing!) surplus of residency positions. In fact, in 2019, there were 1.70 residency positions available for every graduating American M.D. – which is the highest it’s been in over 40 years.
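The arithmetic here is just division. As a quick sketch, using approximate counts from the 2019 NRMP Main Match (the figures below are rounded – check the Results & Data report for the exact numbers):

```python
# Approximate 2019 NRMP Main Match figures (rounded; see the NRMP
# Results & Data report for exact counts).
pgy1_positions = 32_194   # first-year positions offered
all_applicants = 38_376   # all active applicants
us_md_seniors  = 18_925   # graduating U.S. allopathic seniors

print(f"{pgy1_positions / all_applicants:.2f} positions per applicant")    # ~0.84
print(f"{pgy1_positions / us_md_seniors:.2f} positions per U.S. senior")   # ~1.70
```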
Match rates are improving, too. (Not for U.S. seniors – because they’ve had a Match rate of 92-95% since 1982. It’s kind of hard to go up from there.) But if you look at the overall match rate – including all applicant types – it’s increased from 71% in 2008-2009 to 80% in 2019.
Obviously, these statistics do not fit the common narrative that the residency job market is tightening. Application Fever is not driven by a major change in the competitiveness of the residency selection marketplace. (As I’ve discussed before, the perception of increased competition occurs because not all programs are equally desirable to applicants, and applicants realize that ‘overapplying’ to programs provides a relative advantage against their peers in securing a desired commodity.)
So I take issue with the AAMC framing their campaign with this kind of misinformation. Claiming that residency positions aren’t growing at the same rate as applicants is neither fact-based nor helpful if your goal is to encourage candidates to submit a rational number of applications.
PROBLEM #2 – CORRELATION DOES NOT EQUAL CAUSATION
The Apply Smart data are observational. The AAMC did not create these curves by randomizing applicants to apply to x number of programs, and then measuring their match rate. They just observed what happened in the real world.
Thing is, students do not apply to a certain number of residency programs at random. Whether a student applies to 5 programs or 105, there are reasons for it. Students who choose to apply to more programs likely differ systematically from those who choose to apply to fewer in myriad ways that impact their attractiveness to program directors.
In fact, some of the Apply Smart graphics show an interesting pattern for students applying to the largest number of programs. Take, for instance, this one, for students applying in anesthesiology.
APPLY SMART: ANESTHESIOLOGY
Look at the yellow curve – the one for applicants with USMLE Step 1 scores ≥ 237. Notice how it bends downward on the right-hand side of the graphic? In fact, it looks like it would fall significantly below the curve for applicants with lower USMLE scores if the x-axis were extended, doesn’t it?
Would these students have had more success in the match if they’d only applied to fewer programs? I doubt it. There’s something else going on there. More likely, relatively weaker students choose – very reasonably – to apply to more programs to maximize their chance of matching. Stronger candidates just as reasonably choose to apply to fewer.
Simply put, we cannot use the Apply Smart data to conclude that a student can increase his likelihood of successfully entering a residency program by applying to more programs, or that another student will have the same probability of success if she applies to fewer. Observational data just do not support that kind of conclusion. Instead, we can only conclude that the type of candidate who applies to x programs has a certain probability of entering a residency program in that discipline.
PROBLEM #3 – BIAS
And now it’s time to discuss the biggest problem with Apply Smart – the probabilities are all biased.
To show you how, let’s start by examining the Apply Smart graphic for my field: pediatrics.
I love pediatrics. It’s a great field, and I’d recommend it highly. But as residencies go, it’s not exactly a competitive one. There are a lot of very good residency programs, and each one takes a lot of residents. Sure, if you’re hell-bent on matching at one of the so-called “top programs”, then pediatrics is as competitive as anything else. But outside of that, a capable U.S. medical graduate shouldn’t have trouble finding a good match.
So look at the Apply Smart graphic closely, and explain to me what is going on on the left side of the chart.
YIKES! LOOK AT THOSE LOW PROBABILITIES FOR STUDENTS WHO APPLY TO <10 PROGRAMS!
According to this, U.S. allopathic medical graduates who apply to 5 pediatric residency programs have only a 40% chance of successfully entering a residency program. Ouch! Why didn’t someone tell them to apply to more programs?
Of course, something about this doesn’t add up.
Intuitively, we might expect that the applicants who apply to the fewest programs would enjoy the highest success rates. Given the high cost of going unmatched, you’d think that a student would apply to only 2 programs if she were certain that she’d get into one of them, right?
Remember, also, that under the ERAS fee schedule, applying to 10 programs costs exactly the same amount as applying to 1.
So who are these people who are applying to <5 programs and accepting a 20-40% success rate for residency entry?
I’ll tell you who. People who don’t really care if they match in pediatrics or not.
See, even though the y-axis is labeled “Probability of Entering a Residency Program,” the analyses are specialty-specific. That is, a candidate who applied in pediatrics but matched in another specialty is considered as not having entered a residency program.
The problem is, many students apply in more than one specialty. Check out the ERAS Cross Specialty Applicant Data below.
IN 2018, AT LEAST ONE U.S. MEDICAL STUDENT APPLIED TO EVERY SINGLE COMBINATION OF SPECIALTIES LISTED ON THIS TABLE.
Maybe there are 410 students who honestly couldn’t decide between their love for anesthesiology and their passion for pediatrics, and 332 students who were completely torn between a career as a pediatrician and one as a general surgeon. Or maybe, just maybe, most of these candidates are applying to pediatrics as a backup.
And because most of these candidates are well-qualified, and will end up matching in their preferred specialty, they only submit a handful of applications in their backup specialty.
So that’s why the Apply Smart analyses all look the way they do, with an increasing probability of residency entry up to the ‘point of diminishing returns.’ It’s because the data are biased by backup applications.
(In fact, I have a hunch that if the AAMC removed the backup specialty applications and instead ran the analysis using only applicants in their preferred specialty, there would be no “point of diminishing returns.” Think about it: if you had your heart set on being a general surgeon, but you only applied to one surgery program, you’d have to be pretty darn sure you were going to match there, right?)
PROBLEM #4 – MORE BIAS
Including backup applicants doesn’t just bias the determination of the ‘point of diminishing returns’ – it biases the rest of the analysis as well.
To demonstrate how, let’s take a look at the graphic for diagnostic radiology.
APPLY SMART: DIAGNOSTIC RADIOLOGY
Imagine you’re a would-be radiologist with Step 1 scores in the upper tertile, and you want to figure out how many applications you should submit.
So you look at the yellow line in the graphic above and see that the point of diminishing returns is at 20 applications. (Not bad, when you remember that the average U.S. medical student applying in diagnostic radiology submitted 49 applications in 2019.)
But wait. What’s that dashed line extending to the left from the point of diminishing returns? What does that mean?
Oh. Wait a second.
According to Apply Smart, that little dashed line means that even candidates in the highest tertile of USMLE scores have only around a 65% chance of successfully entering a diagnostic radiology residency program when they apply at their point of diminishing returns.
Is diagnostic radiology really that competitive?
No. It isn’t.
In reality, the Match rate for U.S. seniors in 2019 was 89%. (And bear in mind, that’s the Match rate for all comers, not just those in the top third of Step 1 scores.)
Again, this is due to the failure of Apply Smart to exclude candidates who are applying to backup specialties. The net effect is that the asymptote of the probability curves falls below the actual Match rate for every specialty.
FOR EVERY SPECIALTY, THE MAXIMUM PROBABILITY OF RESIDENCY ENTRY ACCORDING TO APPLY SMART IS SIGNIFICANTLY LOWER THAN THE ACTUAL MATCH RATE.
The psychology here is powerful.
Not long ago, on a slow day in clinic, I showed the Apply Smart graphics to a few medical students, and asked them what they took away from the graphics.
One student – a young woman who wanted to be an anesthesiologist – looked at the graphic for that specialty and identified her point of diminishing returns as 18 applications. She smiled and looked relieved – her advisor had suggested that she apply to 30-35 programs.
DON’T BELIEVE YOUR EYES. IN REALITY, 96% OF U.S. MD APPLICANTS SUCCESSFULLY MATCHED IN ANESTHESIOLOGY IN 2018.
And then, without any prompting, she noticed that submitting 18 applications was seemingly associated with only a 70% chance of getting into an anesthesia program.
Her face clouded with worry.
I asked how many programs she thought she’d apply to. “Probably 30 or 35,” she said.
WHAT IS THE LIKELY EFFECT OF APPLY SMART?
The AAMC is quick to point out that, for almost all specialties, the number of applications submitted by the average student is greater than the point of diminishing returns. Therefore, the reasoning goes, the Apply Smart data should lead medical students to submit fewer applications.
Will it?
I doubt it.
For the reasons I mentioned above, I question how many students who overapply will apply to fewer programs in response to the Apply Smart graphics. (Try convincing a student with $200,000+ in student loans to apply to fewer programs – when doing so appears to confer only a 60-70% probability of matching in their dream specialty. It’s gonna be a tough sell.)
But for a moment, let’s assume that the AAMC is right, and that all the students who are overapplying choose to decrease their applications to their point of diminishing returns.
In that case, the mean number of applications submitted by students would fall. But would program directors receive fewer applications overall?
Maybe. It depends on whether the Apply Smart data encourage other applicants to apply to more programs.
See, for students currently applying to fewer programs, the interpretation of the Apply Smart graphic is unambiguous: you should apply up to your ‘point of diminishing returns’ in order to increase your probability of entering a residency program.
However, if the candidates on the left side of the graph apply to more programs at the same time that candidates on the right side apply to fewer, then the changes offset each other, and program directors will remain just as buried in applications as they are now.
To predict the effect on overall applications, we’d need to know the actual distribution of applications submitted. Then we could estimate how many students apply below their point of diminishing returns, and how many apply above it.
The AAMC doesn’t provide this information. But some other studies do.
Here, for instance, is the distribution of applications submitted by current internal medicine residents.
FROM: ANGUS SV, ET AL. AM J MED 2018; 131(4): 447-452. PUBMED
Notice that the number of applications submitted is not a normal distribution. It’s skewed to the right, with a relatively small number of applicants submitting a large number of applications.
Other specialties likely have similar distributions. And because we’re dealing with right-tailed distributions, where the mean is greater than the median, encouraging applicants to increase their applications up to the ‘point of diminishing returns’ will significantly increase the overall number of applications that are submitted.
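To see why the skew matters, here’s a toy simulation: draw a right-skewed (lognormal) distribution of application counts with a mean near 51, put the point of diminishing returns at 23 (the EM middle-tertile value), and compare the total number of applications when only the under-appliers move up to the point versus when everyone converges on it. The distribution and its parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
# Invented right-skewed distribution of application counts (mean ~52).
apps = rng.lognormal(mean=np.log(35), sigma=0.9, size=10_000)
podr = 23  # point of diminishing returns (EM middle tertile, per the AAMC)

baseline = apps.sum()
# Scenario A: under-appliers rise to the PODR; over-appliers don't budge.
only_up = np.maximum(apps, podr).sum()
# Scenario B: everyone applies to exactly the PODR.
converge = podr * apps.size

print(f"Only under-appliers rise: {only_up / baseline - 1:+.1%}")
print(f"Everyone converges:       {converge / baseline - 1:+.1%}")
```

Whether overall volume rises or falls depends entirely on which group actually moves. And as noted above, the biased probability curves give the heavy appliers little reason to cut back.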
Check out these data for general surgery, one of the few specialties in which the median number of applications is publicly available:
FROM: JOSHI ART, ET AL. J SURG EDUC 2019. PUBMED
(In light of this, I’ll leave it to you to surmise why the AAMC only reports the mean number of applications that students submit for most specialties, and not the median or interquartile range or other more informative statistics.)
THE BOTTOM LINE
Apply Smart is not going to fix Application Fever. At best, the analyses are biased and largely uninformative. At worst, they’re actually engineered to stimulate an overall increase in applications.
We can do better… and in Part 2, I’ll explain how.
Dr. Carmody is a pediatric nephrologist and medical educator at Eastern Virginia Medical School. This article originally appeared on The Sheriff of Sodium here.