Tag: Bryan Carmody

Recommendations From the Coalition for Physician Accountability’s UME-to-GME Review Committee: Winners & Losers Edition

By BRYAN CARMODY

If you’re involved in medical education or residency selection, you know we’ve got problems.

And starting a couple of years ago, the corporations that govern much of those processes decided to start having meetings to consider solutions to those problems. One meeting begat another, bigger meeting, until last year, in the wake of the decision to report USMLE Step 1 scores as pass/fail, the Coalition for Physician Accountability convened a special committee to take on the undergraduate-to-graduate medical education transition. That committee – called the UME-to-GME Review Committee or UGRC – completed their work and released their final recommendations yesterday.

This isn’t the first time I’ve covered the UGRC’s work: back in April, I tallied up the winners and losers from their preliminary recommendations.

And if you haven’t read that post, you should. Many of my original criticisms still stand (e.g., the lack of medical student representation, or the structural configuration that effectively gave corporate members veto power), but here I’m gonna try to turn over new ground as we break down the final recommendations, Winners & Losers style.

#USMLEPassFail: A Brave New Day

By BRYAN CARMODY, MD

Well, it happened.

Beginning as soon as 2022, USMLE Step 1 scores will be reported pass/fail.

I’m shocked. Starting around two weeks ago, I began hearing rumors from some well-connected people that this might happen… but I still didn’t believe it.

I was wrong.

The response thus far has been enormous – I haven’t been able to clear my Twitter mentions since the news broke. And unsurprisingly, the reaction has been mixed.

In the future, I’ll post more detailed responses on where we go from here – but for now, I’d like to emphasize these five things.

Why Do We Have Residency Training?

By BRYAN CARMODY, MD

Surely every resident has had the experience of trying to explain to a patient or family what, exactly, a resident is. “Yes, I’m a real doctor… I just can’t do real doctor things by myself.”

In many ways, it’s a strange system we have. How come you can call yourself a doctor after medical school, but you can’t actually work as a physician until after residency? How – and why – did this system get started?

These are fundamental questions – and as we answer them, it will become apparent why some problems in the medical school-to-residency transition have been so difficult to fix.

In the beginning…

Go back to the 18th or 19th century, and medical training in the United States looked very different. Medical school graduates were not required to complete a residency – and in fact, most didn’t. The average doctor just picked up his diploma one day, and started his practice the next.

But that’s because the average doctor was a generalist. He made house calls and took care of patients in the community. In the parlance of the day, the average doctor was undistinguished. A physician who wanted to distinguish himself as being elite typically obtained some postdoctoral education abroad in Paris, Edinburgh, Vienna, or Germany.

Applying Smarter, Part 1: Breaking Down the AAMC’s Apply Smart Campaign

By BRYAN CARMODY, MD

“YOUR LIKELIHOOD OF SECURING RESIDENCY TRAINING DEPENDS ON MANY FACTORS – INCLUDING THE NUMBER OF RESIDENCY PROGRAMS YOU APPLY TO.”

So begins the introduction to Apply Smart: Data to Consider When Applying to Residency – an informational campaign from the Association of American Medical Colleges (AAMC) designed to help medical students “anchor [their] initial thinking about the optimal number of applications.”

In the era of Application Fever – where the mean number of applications submitted by graduating U.S. medical students is now up to 60 – some data-driven guidance on how many applications to submit would be welcome, right?

Right?

And yet, the more I review the AAMC’s Apply Smart campaign, the more I think that it provides little useful data – and the information it does provide is likely to encourage students to submit even more applications.

This topic will be covered in two parts. In the first, I’ll explore the Apply Smart analyses and air my grievances against their logic and data presentation. In the second, I’ll suggest what the AAMC should do to provide more useful information to students.

Introduction to Apply Smart

The AAMC unveiled Apply Smart for Residency several years ago. The website includes lots of information for students, but the pièce de résistance is the set of analyses and graphics that relate the number of applications submitted to the likelihood of successfully entering a residency program.

What’s on USMLE Step 1?

By BRYAN CARMODY

Recently, I was on The Accad and Koka Report to share my opinions on USMLE Step 1 scoring policy. (If you’re interested, you can listen to the episode on the show website or iTunes.)

Most of the topics we discussed were ones I’ve already dissected on this site. But there was an interesting moment in the show, right around the 37:30 mark, that raises a point worthy of further analysis.

__

ANISH: There’s also the fact that nobody is twisting the arms of program directors to use [USMLE Step 1] scores, correct? Even in an era when you had clinical grades reported, there still seems to be value that PDs attach to these scores. . . There’s no regulatory agency that’s forcing PDs to do that. So if PDs want to use, you know, a number on a test to determine who should best make up their class, why are you against that?

BRYAN: I’m not necessarily against that if you make that as a reasoned decision. I would challenge a few things about it, though. I guess the first question is, what do you think is on USMLE Step 1 that is meaningful?

ANISH: Well – um – yeah…

BRYAN: What do you think is on that test that makes it a meaningful metric?

ANISH: I – I don’t- I don’t think that – I don’t know that memorizing… I don’t even remember what was on the USMLE. Was the Krebs Cycle on the USMLE Step 1?

__

I highlight this snippet not to pick on Anish – who was a gracious host, and who, despite our back-and-forth on Twitter, actually agreed with me much more than he disagreed. And since he’s a practicing clinician who is 15 years removed from the exam, I’m not surprised in the least that he doesn’t recall exactly what was on the test.

I highlight this exchange because it illuminates one of the central truths in the #USMLEPassFail debate, and that is this:

Physicians who took Step 1 more than 5 years ago honestly don’t have a clue about what is tested on the exam.

That’s not because the content has changed. It’s because the memories of minutiae fade over time, leaving behind the false memory of a test that was more useful than it really was.

I’m speaking from experience here.

USMLE Step 1: Leveling the Playing Field – or Perpetuating Disadvantage?

By BRYAN CARMODY

Let me show you some data.

I’m going to show you the Match rate and mean Step 1 score for three groups of residency applicants. These are real data, compiled from the National Resident Matching Program’s (NRMP) Charting Outcomes in the Match reports.

Ready?

  • U.S. Allopathic Seniors: 92% match rate; Step 1 232.3
  • U.S. Osteopathic Seniors: 83% match rate; Step 1 225.8
  • International Medical Graduates, or IMGs (both U.S. and non-U.S. citizens): 53% match rate; Step 1 223.6

Now. What do you conclude when you look at these numbers?

__

In the debate over the U.S. Medical Licensing Examination’s (USMLE) score reporting policy, there’s one objection that comes up time and time again: that graduates from less-prestigious medical schools (especially IMGs) need a scored USMLE Step 1 to compete in the match with applicants from “top tier” medical schools.

In fact, this concern was recently expressed by the president of the National Board of Medical Examiners (NBME) in an article in Academic Medicine (quoted here, with my emphasis added).

“Students and U.S. medical graduates (USMGs) from elite medical schools may feel that their school’s reputation assures their successful competition in the residency application process, and thus may perceive no benefit from USMLE scores. However, USMGs from the newest medical schools or schools that do not rank highly across various indices may feel that they cannot rely upon their school’s reputation, and have expressed concern in various settings that they could be disadvantaged if forced to compete without a quantitative Step 1 score. This concern may apply even more for graduates of international medical schools (IMGs) that are lesser known, regardless of any quality indicator.”

The funny thing is, when I look at the data above, I’m not sure why we would conclude that IMGs are gaining advantage from a scored Step 1. In fact, we might conclude just the opposite – that a scored Step 1 is a key reason why IMGs have a lower match rate.
