By GREGORY HOPSON

There is no doubt Robert F. Kennedy, Jr. is sincere about wanting to make the world a better place. The Hudson River cleanup, which he helped lead, is one of the most successful environmental achievements in the United States. It had bipartisan support, set global standards, and earned the highest compliment: imitation.
It was quite reasonable to believe Mr. Kennedy could use those same skills and passion to lead Health and Human Services (HHS). He has a proven track record with complex systems, scientific evidence, and protecting public welfare. Even skeptics of his appointment want him to succeed.
But skill sets in one domain do not always translate to another, no matter how strong they are. And it can be very difficult to realize this until the Law of Unintended Consequences starts to complicate things, as is now happening with Mr. Kennedy’s approach to public health. I know the feeling, because I made the very same mistake.
Lessons from Databases
I had over fifteen years of experience with databases in auto parts, newspapers, manufacturing, and insurance before I started working with healthcare databases. Each domain had its own complex logic, but I could adapt from one to another relatively easily.
When I started at the University of Iowa’s Department of Anesthesia, I was confident I could make a smooth transition to a new domain as I always had.
My first assignment was simple: create a report of the active prescription medications listed for a patient at a given appointment. It didn’t take very long to figure out how to find patient data, appointment data, and prescription data. My expertise in databases was transferring to a new domain quite smoothly!
All I had to do was use a chart I had and see how to make the connections.
I can read…how hard could that be?
Not only was it harder than I expected, but I didn’t immediately recognize why.
Parallel Paths
Mr. Kennedy took a similar path with vaccines and autism. He could see patient data. He could see vaccine data. He could see autism data. The connections seemed clear.
In my case, a researcher at Iowa had a theory that the length of a clinical appointment could be predicted by the number of prescriptions a patient was taking. My job was to combine all of the relevant data. He would then use that for his calculations.
I built a dataset. Everything looked right. But I was so new that I didn’t realize there were hidden flags identifying appointment types, and other flags marking which prescriptions were active on the date of the visit. I didn’t even know those flags existed. “Flag” is an oversimplification; the reality was far more complex than that.
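To make the mistake concrete, here is a minimal Python sketch of the kind of join I was building. The records, field names, and dates are all invented for illustration; the real schema was far more complex, and the “flag” was not a single tidy column. The point is only how a naive join quietly includes records it shouldn’t.

```python
from datetime import date

# Hypothetical prescription records for one patient (invented data).
prescriptions = [
    {"patient_id": 1, "drug": "lisinopril",  "start": date(2019, 1, 5), "end": None},
    {"patient_id": 1, "drug": "amoxicillin", "start": date(2018, 3, 1), "end": date(2018, 3, 10)},
    {"patient_id": 1, "drug": "metformin",   "start": date(2020, 6, 1), "end": None},
]

visit_date = date(2019, 7, 15)

# Naive join: every prescription ever recorded for the patient.
# This "looks right" -- it runs, and it returns plausible rows.
naive = [rx["drug"] for rx in prescriptions if rx["patient_id"] == 1]

# Corrected join: only prescriptions active on the date of the visit.
def active_on(rx, day):
    """True if the prescription was in effect on the given day."""
    return rx["start"] <= day and (rx["end"] is None or day <= rx["end"])

active = [rx["drug"] for rx in prescriptions
          if rx["patient_id"] == 1 and active_on(rx, visit_date)]

print(naive)   # all three drugs, including ones long expired or not yet started
print(active)  # only the prescription actually active at the visit
```

Both versions run without error and return plausible-looking data, which is exactly why the error is hard to see until someone who knows the domain looks at the output.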
Kennedy thought he had confirmation of his theory in 1998, when Andrew Wakefield and colleagues published a study in The Lancet suggesting a link between the MMR vaccine and autism. It looked right. It seemed obvious. A lot of people believed it. But like my report, it was flawed: a small sample size, an uncontrolled design, and speculative conclusions. My initial dataset contained false data because I missed some flags, but my mistake was caught long before the data got anywhere near a study. The Wakefield study not only included falsified data; it made it all the way to publication, and was later retracted.
My researcher kindly showed me my errors, and I was fortunate it happened early in the process. Meanwhile, epidemiologists and clinicians have repeatedly shown Mr. Kennedy where his conclusions don’t stand up. Yet, like a friend of mine who once argued astronomy with Dr. James Van Allen (yes, the Van Allen Belts Van Allen), some convictions are hard to let go of, no matter how authoritative the counterevidence.
Three Questions for Transferring Expertise
I have learned to ask myself three questions whenever I enter a new domain — and I think they apply to all of us…including Mr. Kennedy.