We’ve all heard the big philosophical debates between rockstar entrepreneurs and genius academics – but have we stopped to think about exactly how the AI revolution will play out on our own turf?
At RSNA this year I posed the same question to everyone I spoke to: what if radiology AI gets into the wrong hands? Judging by the way the crowds voted with their feet, packing out every lecture on AI, radiologists certainly seem very aware of the looming seismic shift in the profession – but I wanted to know if anyone was considering the potential side effects, the unintended consequences of unleashing such a disruptive technology into the clinical realm.
While I’m very excited about the prospect and potential of algorithmic augmentation in radiological practice, I’m also a little nervous about more malevolent parties using it for predatory financial gains.
This is what I call the ‘Nightmare on ML Street’ scenario…
Luckily this advertisement is a fake (I mocked it up), but I wanted to depict with absolute clarity the medico-legal nightmare that could unfold.
Let’s assume three things first:
1. AI/ML algorithms reach a point at which they are clinically validated to be super-human. Given enough time, I think this is almost certain. We aren’t quite there yet – after all, there hasn’t been a single randomised controlled trial or Phase III study demonstrating it – but these are coming. Plenty of early evidence suggests that algorithms are indeed more accurate than radiologists at finding specific things when trained on enough high-quality annotated data.
2. Medical-negligence litigation allows for retrospective analysis of patient records, including imaging studies. This is currently the case, and is the bread and butter of ‘No Win, No Fee’ law firms preying on disgruntled patients who feel their care wasn’t up to their expectations. In a court of law these firms simply have to prove, on the balance of probabilities, that a doctor acted negligently and under-performed against their codes of practice. This is done by bringing in an expert witness to confirm that an error has been made. In radiology this is made extremely easy, as radiologists are quite literally the only profession that takes pictures of its mistakes.
3. A radiology report is a written, digitally signed and traceable document that binds the radiologist to their opinion. Once written (and unless an amendment is made and the changes communicated), that report is an immutable part of the patient’s care record, and stands as a piece of the overall diagnostic process. If the final diagnosis is wrong, or omitted entirely, then there is both pictorial and written evidence to prove that a medical error has occurred.
31% of American radiologists have experienced a malpractice claim in their career, with ‘missed diagnosis’ the most common allegation. Breast cancers top the list, followed closely by missed fractures and lung cancers. There is even a recent trend towards increasing litigation, driven in part by the inexorable rise in workload, the push for efficiency, and the consolidation of healthcare providers, all of which feed a perceived reduction in quality.
However, in order to begin a malpractice case, a law firm must be approached by a patient, and the care record requested as evidence. In essence, it is at the patient’s discretion whether or not to instigate the legal process.
Here’s where I start to get concerned about AI. Imagine a law firm starts advertising directly to consumers, exactly as accident-chasers do today. Patients are, after all, allowed to request a copy of their imaging and reports, and are free to share them with whomever they choose. If a law firm can convince patients to hand over their care records and images to a ‘free of charge’ checking service powered by AI, we could start to see a snowballing of retrospective clinical errors being found.
Let’s use an example:
Mr Smith sees an advert online for a ‘No Win, No Fee’ service that checks whether his scans were correctly reported. He downloads his care records from his health provider, including a CT of his chest, and uploads them to the law firm’s server. In a few seconds the algorithms find a 5mm lung nodule and mild degeneration at T10/11, neither of which was included in the original report. The website recommends a consultation with a lawyer, who then suggests suing for malpractice. Mr Smith says he has had some back pain, come to think of it, and is now worried he has lung cancer. He gets a repeat CT scan; the nodule is now 8mm. The lawsuit goes to trial, the original reporting radiologist is called in, and an expert witness agrees with the algorithm that, yes, there is clearly a lung nodule and some spinal degeneration. The judge bangs his gavel, and bingo – a few million dollars are paid out.
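To make the mechanics concrete, here is a minimal sketch of what such a checking service’s back end might look like. Everything in it is hypothetical – the detector is a stub returning a fabricated finding, and the file names are invented – but it shows how thin the layer is between a consumer upload and an ‘unreported finding’.

```python
# Hypothetical sketch of the "free checking" pipeline; not a real
# product. The detector is a stub and every name is illustrative.
import re
from pathlib import Path

import pydicom  # widely used open-source DICOM reader


def load_ct_series(study_dir: Path) -> list:
    """Read every slice in a study folder, sorted along the scan axis."""
    slices = [pydicom.dcmread(str(p)) for p in study_dir.glob("*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    return slices


def detect_findings(slices: list) -> list:
    """Stand-in for a trained detection model.

    A real service would run inference here; we return a fabricated
    finding purely to show the data flow.
    """
    return [{"type": "lung nodule", "size_mm": 5, "location": "right upper lobe"}]


def report_mentions(report_text: str, finding: dict) -> bool:
    """Crude keyword check: did the signed report mention this finding?"""
    return bool(re.search(finding["type"], report_text, re.IGNORECASE))


def screen_study(study_dir: Path, report_text: str) -> list:
    """Return AI findings that are absent from the original report."""
    slices = load_ct_series(study_dir)
    return [f for f in detect_findings(slices)
            if not report_mentions(report_text, f)]


if __name__ == "__main__":
    report = Path("mr_smith_report.txt").read_text()
    for miss in screen_study(Path("mr_smith_ct"), report):
        print(f"Unreported finding: {miss['type']}, "
              f"{miss['size_mm']} mm, {miss['location']}")
```

The point is not the code itself, but how little plumbing sits between an upload portal and a verdict-shaping discrepancy list.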
So far, nothing particularly untoward. A missed finding has been proven, and the lawsuit has been successful. The only difference from today’s situation is that an algorithm was involved.
(Hopefully no-one missed this diagnosis…)
Here’s where it gets really scary…
If enough patients go through this process, the predatory law firm could start a class-action lawsuit against a healthcare provider. This is a whole new level of legal proceedings. In theory, a class action allows the claimants to request access to whole swathes of medical records, potentially not even related to the patients they represent. This could be done in the name of patient safety, with claims that the entire radiology department is unfit to practise, and that everyone treated there therefore has a right to know.
A judge could rule to allow retrospective analysis of the hospital’s entire back-catalogue of scans – hundreds of thousands of studies in one go. AI algorithms could easily process that data in a matter of hours, potentially finding MILLIONS of lung nodules that went unreported. The legal ramifications and the size of the pay-outs would be earth-shattering, career-ending and profession-ruining.
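The arithmetic behind ‘a matter of hours’ is simple enough to sketch. The figures below – archive size, per-study inference time, number of parallel workers – are assumptions for illustration, not benchmarks of any real system:

```python
# Back-of-envelope throughput for a retrospective trawl.
# Every number here is an assumption, chosen only for illustration.
studies = 500_000        # hypothetical hospital back-catalogue
secs_per_study = 2.0     # assumed model inference time per CT study
workers = 16             # assumed parallel GPU workers

hours = studies * secs_per_study / workers / 3600
print(f"{hours:.1f} hours to screen the whole archive")  # ~17.4 hours
```

On those assumptions, half a million studies fall in under a day; the bottleneck is moving the images, not analysing them.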
I sincerely hope we never end up in this situation.
Some will say this would never be allowed: retrospective examination using new tools that weren’t previously available is unfair and unethical, and I would agree. What I fear, however, is malevolent third parties using AI merely to find the mistakes, then bringing in a human expert witness to confirm them. In that manner, in a court of law, it is human versus human, just as malpractice suits are conducted today. The only difference is that AI was used to bring the cases to light, at unprecedented scale.
What I’ve highlighted is entirely theoretical, although not impossible. AI has the potential to massively improve the radiological workflow and augment radiologists. Used in the right hands and in the right context, it will revolutionise the field. But if it gets into the wrong hands and is turned against us, we face a truly terrifying future – one that will only damage our profession and our ability to care for patients.
The interesting ethical question is this: do we as radiologists want to protect ourselves from retrospective algorithmic cross-examination, or are we duty-bound to give our patients the best possible care, even if that means finding all of our previous mistakes?
Whatever your viewpoint – it will be interesting to see how the AI landscape unfolds. I’m hoping this nightmare won’t wake us up from the dream of achieving what we set out to do!
About the author:
Dr Harvey is a board-certified radiologist and clinical academic, trained in the NHS and at Europe’s leading cancer research institute, the ICR, where he was twice awarded Science Writer of the Year. He headed the regulatory affairs team at Babylon Health, gaining a world-first CE marking for an AI-supported triage service, and is now a consultant radiologist, a member of the Royal College of Radiologists informatics committee, and an advisor to AI start-ups, including Kheiron Medical. This piece was originally published here.
Comments:
I wonder if the AI paranoia just reflects the underlying instability of EHR risk management. Question: does your working environment regularly publish reports of IT downtime per month, serious hacking attempts, or virus sequestration? Probably it’s in the league of “see no evil, hear no evil.” I understand shrinking employment from robotic replacement. But with the pervasive influence of Parkinson’s Law throughout our nation’s healthcare institutions for nearly 50 years, it will be a cold day on the surface of our sun before radiology employment suffers from AI advancement.
Surgeons making difficult decisions don’t want a piece of paper to read; they want to talk with their favorite radiologist or pathologist: trust, cooperation and reciprocity prevail. Over time, that process drives the underlying diagnostic integrity of their respective hospital.
The original Parkinson’s Law was defined as: “Work expands to fill the space allocated for its performance.” More currently, it is known as: “Work expands to use the resources available.” It represents the cause of causes for the level of inefficiency underlying our nation’s excessive health spending. Question: do you know when the next renovation of your hospital is scheduled?
The medical and legal professions are on divergent paths (and probably have been for years): doctors are trying not to overdo testing and procedures, while attorneys fault them for not doing enough. That doesn’t make for controlling healthcare costs.