
How Radiologists Think

Diagnostic tests such as CAT scans are not perfect. A test can make two errors. It can call a diseased person healthy – a false negative. This is like acquitting a person guilty of a crime. Or a test can call a healthy person diseased – a false positive. This is like convicting an innocent person. There is a trade-off between false negatives and false positives: to achieve fewer false negatives we incur more false positives.
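For readers who like to see the trade-off in numbers, here is a minimal sketch in Python (my illustration, not the author's; the score distributions and thresholds are made up). It treats the test as a score and calls a scan positive whenever the score crosses a threshold, then shows how moving the threshold swaps one kind of error for the other.

    # Minimal sketch (illustrative only): a diagnostic test yields a score,
    # and we call "disease" when the score crosses a threshold. Lowering the
    # threshold misses fewer true cancers (fewer false negatives) but flags
    # more healthy people (more false positives). All numbers are made up.
    import random

    random.seed(0)

    # Simulated test scores: diseased patients tend to score higher,
    # but the two distributions overlap, so no threshold is perfect.
    healthy = [random.gauss(0.0, 1.0) for _ in range(10_000)]
    diseased = [random.gauss(1.5, 1.0) for _ in range(10_000)]

    for threshold in (2.0, 1.0, 0.0):
        false_negatives = sum(score < threshold for score in diseased)
        false_positives = sum(score >= threshold for score in healthy)
        print(
            f"threshold {threshold:+.1f}: "
            f"{false_negatives:5d} false negatives, "
            f"{false_positives:5d} false positives"
        )

Lowering the threshold catches more cancers but drags more healthy people into the work-up, which is exactly the choice the article goes on to describe.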

Physicians do not want to be wrong. Since error is possible, we must choose which side to err towards; that is, we must choose between two kinds of wrongness. We have chosen to reduce false negatives at the expense of false positives. Why this is so is illustrated by screening mammography for breast cancer.

A woman whose cancer the mammogram picks up is thankful to her physician for detecting it and, plausibly, saving her life.

A woman who does not have cancer and whose mammogram is normal is also thankful to her physician. The doctor does not really deserve the thanks, as she had no hand in the absence of the patient's cancer. But instead of thanking her genes or the cosmic lottery, the patient thanks the doctor.

How about the false negative – the cancer missed on the mammogram? A missed cancer on mammography is one of the commonest reasons doctors get sued. The false negative is not a statistic but a real person. We promised her early detection of cancer but we failed. It is not surprising that she sues us for breaking our promise.

Now consider the false positive. She doesn't have cancer, but the mammogram flags a possible cancer because of a suspicious finding. Abnormalities on mammograms are seldom binary; there are shades of gray. Because her shade of gray is a suspicious one, she has an ultrasound and then a biopsy. She waits for the results of the biopsy, her heart pounding with anxiety. The physician breaks the news to her: "no cancer, your biopsy is negative."

Imagine her relief. Far from being angry with the doctor for taking her down a rabbit hole, she is grateful. That the possible abnormality on her mammogram was not ignored shows that her doctor cares. You can never care too much. You can never be too safe. Better safe than sorry.

This reminds me of Stockholm syndrome – a curious phenomenon first described after a bank robbery, in which hostages develop positive feelings for their captors and an exaggerated appreciation for acts of unexpected kindness. Is the gratitude of the false positive the medical variant of Stockholm syndrome?

Doctors are thanked by the false positives but can be sued by the false negatives. Our choice is a no-brainer. Better thanked than sued.

Doctors haven’t stopped being wrong. We just make more tolerable mistakes. But we are not alone. We live in a society obsessed with safety. Precaution is the new morality. False positive is precaution by another name.

Saurabh Jha is a radiologist based in Pennsylvania.

11 replies

  1. Since we cannot avoid or remove economics from healthcare, we should embrace it and use it in a free marketplace that permits the patient to ultimately decide what is of value to him.

    This does not mean that people die in the streets, nor does it mean they cannot be helped by others.

  2. Allan, of course you are correct. Economics does influence care. The point, to me, is that it should not. There is no rational price/outcome relationship. We allow economic principles into practice in a way that influences care inappropriately, in my view. Economics is a poor model when there is no measure of worth from the patient’s perspective.

  3. Economics has a lot of effect on patient care. Just consider the patient who is treated in an HMO where the physician might be getting bonuses. Alternatively, think of malpractice suits and physicians who take the safer way out. But the issue is bigger than economics alone: economics is just one of the numerous incentives that affect care.

  4. No problem; I like Steve and loved his books. He is an economist, for sure, and, I admit, I don’t think economics has much to do with patient care. It is a misdirection. The point I am trying to make, and get us to think about, is that perhaps some of our “beliefs” about how we think may be incorrect. The more we know, the more the psychology changes. I don’t believe in the concept of false positive/negative or true positive/negative. That is a misconception of how tests work. I blogged on this and am writing a book on diagnosis for the public, trying to rethink some of my old ideas about tests and decisions. Thanks for your comments. Bob

  5. rmcnutt, just to make it clear, I was quoting the author. I note the quotation marks are absent. The blog reformats comments initially written and formatted elsewhere. Steve Levitt is an economist. Sorry.

  6. I would have to see the information on the kicks to see what you mean by “more than they should”, but perhaps, if true, it is because goalies know that most shots go right or left; they are acting on knowledge of kickers’ habits. If patients knew that the radiologist’s predictions are meaningless for their outcomes of care, and that early detection has no proven benefit, they would behave differently, just as kickers would behave differently once they learned that, because most shots go right or left and goalies dive accordingly, the center is on average the more advantageous choice (an advantage that would vanish once everyone acted on it). Our ideas of the psychology of cognition would change with them. I am not a cybernetics expert, but systems, patients, beliefs and feelings change with knowledge; people are not a closed, economic, predictive algorithm. The knowledge of outcomes changes the psychology. Most of the stuff we think we know about cognition is based on a general lack of knowing. I still think we are thinking wrong on this stuff; fun to talk about but not helpful concepts for patients.

  7. Saurabh, what you say about radiologists is true for sports and elsewhere. Here is a snippet from Steve Levitt on the Freakonomics blog.

    In my paper with Tim Groseclose and Pierre-Andre Chiappori, we test the predictions of game theory using penalty kicks in soccer. We find that the players’ actions conform very closely to the theoretical ideal.
    There is one big deviation that we see between what players actually do and what the theory predicts: kickers kick the ball right down the middle much less than they should. Or put another way, in practice, kicking it down the middle scores at a higher rate than kicking it either to the left or right (at least in our data set).
    Why? If you kick it right down the middle and you don’t score, it is damn embarrassing. So even though the middle is a great play statistically, kickers don’t choose it very often. There are some things that are even more important than winning, like not looking like a fool.

    (cont) http://freakonomics.com/2006/06/27/in-soccer-it-is-not-whether-you-win-or-lose-but-how-you-play-the-game/

  8. Very thoughtful post; it is also interesting, to me at least, that it is conditioned on the “fact” that the radiologist/physician has something to do with the decision making, thereby forcing the physician into a “thank you versus lawsuit” conundrum. Dr Accad is on the right track; patients are the only ones who can trade off the false versus true stuff. I find that patients presented with the data choose differently than physicians do. Also, are we still sold on the idea that early detection matters? Are we selling too much here? I am glad you posted; we are still stuck in old-time thinking. Physicians need to step out of the decision role and into a data-presenting role. The way radiologists (or any physician) should think is that they should not. We keep posing tests or treatments to patients, perhaps, from the view of our suppositions of what “they” may want or need, or of what we think we need in order to protect ourselves – rather than letting patients tell us.

  9. Another thing we do is try a little harder to diagnose something that can be treated than something that cannot. [E.g., try to make it a seminoma or a chorio rather than an embryonal ca… if you can. Try to make it an appendiceal carcinoid rather than an adenoca. Always try to rule out metastatic melanomas and pancreatic adenos.]

    I’m not sure if this is good or bad…it just happens.

  10. Doesn’t change the incentives of the physician at all. What it means is that those who can afford the suggested follow-up will do so. Those who cannot, won’t. Which will also be the case for true positives, as those who cannot afford to pay will find out much later.

  11. You are correct, Saurabh, but your analysis is conditioned by the current system in which mammography is implemented as a free public health service. Let women actually express their individual preferences for true and false positives (by paying for the mammogram and for its consequences), and the situation will no longer seem like a costly dilemma.