Over four years of medical school, a one-year internship, a four-year radiology residency, a one-year neuroradiology fellowship, and now some time as an attending, one of my consistent takeaways has been how well (and thus how badly) the traditional academic hierarchy conforms to The Peter Principle.
The Peter Principle, formulated by Laurence J. Peter in 1969, postulates that an individual’s promotion within an organizational hierarchy is predicated on their performance in their current role rather than their aptitude for the intended one. In other words, people are promoted until they are no longer qualified for the position they hold, and “managers rise to the level of their incompetence.”
In academic medicine, this is particularly compounded by the conflation of research prowess and administrative skill. Writing papers and even winning grants do not necessarily correlate with the skills needed to successfully manage humans in a clinical division or department. I don’t think it would be an overstatement to suggest that they may even be inversely correlated. But this is precisely what happens when research is the fiat currency of meaningful academic advancement.
The business world, and particularly the tech giants of Silicon Valley, has widely promoted (and perhaps oversold) its organizational agility, which in many cases has been at least partially attributed to a relatively flat organizational structure: the more hurdles and mid-level managers an idea has to clear, the less likely anything important is to get done. A strict hierarchy promotes stability primarily through inertia but consequently strangles change and holds back individual productivity and creativity. The primary function of managers becomes preserving their position within management. As Upton Sinclair wrote in I, Candidate for Governor: “It is difficult to get a man to understand something when his salary depends upon his not understanding it” (which, incidentally, is a perfect summary of everything that is wrong in healthcare and politics).
The European University (in Italy, Germany, France, and England) descended from the Church. The academic hierarchy, reflected in its regalia, has its roots in organized religion.
The American University was a phenocopy of the European University, but the liberal arts college was a unique American contribution, wherein teaching was considered a legitimate academic pursuit. Even its closest analogues in Europe (the colleges of Cambridge and Oxford) are not as purely educational as the American liberal arts college.
The evolution of American medical education (adapted and updated from: Ludmerer KM. Time to Heal, Oxford University Press, Oxford, 1999) may be divided into five eras.
I. The pre-Flexnerian era (1776–1910) was entirely proprietary in nature. Virtually anyone with the resources could start a medical school; there were no academic affiliations for medical schools and no national standards.
II. The inter-war period (1910–1945) was characterized by an uneasy alliance between hospitals and universities. Four major models emerged. In the Johns Hopkins model, led by William Osler, the medical school and the hospital were married, and the teaching of medicine took place at the bedside. The Harvard model, in which the hospitals grew up independently with only a loose alliance with the medical school, represented a hybrid between pre- and post-Flexnerian medical education.