The rAIdiologist will see you now


The year is 2019 and Imaging By Machines have fulfilled their prophecy and control all Radiology Departments, making their organic predecessors obsolete.

One such lost soul tries to decide how he might reprovision the diagnostic equipment he has set up on his narrow boat on the Manchester Ship Canal, musing on the extent of the digital takeover over his supper (cod, of course).

What I seek to do in this short paper is not to revisit the well-trodden road of what artificial intelligence, deep learning, machine learning or natural language processing might be, or the data science that underpins them, nor to limit myself to the specific products or algorithms currently available or pending. Instead, I look to share my views on where in the patient journey I perceive there may be uses for “AI” in the pathway.

For the purposes of this discussion I therefore refer neither to “Artificial” intelligence nor “Augmented” intelligence but have instead coined the term “Applied” intelligence as a moniker I feel more fitting for the broad brush.

Whilst I write primarily from a UK/NHS perspective here, I would suggest many of the challenges and potential use-cases presented may be applicable to other systems. Similarly, some of the broad suggestions (for example “clinical decision support”) may be relevant in slightly different guises at different parts of the pathway.

A Global Solution

The NHS is not alone in operating under the pressure of a relentless increase in demand for imaging diagnostics – approximately 10% year on year – which far outstrips capacity even when upskilling and skill-mix approaches are taken into consideration.

At the same time, provision has transitioned from a single, local hospital-based general radiology department meeting all the needs of its populace to multiple sites: be it increased community-based imaging resources or federated specialised centres across a region.

The role of the radiologist has also evolved over the past decade or so, driven in part by the increased importance of MDTs (Tumour Boards) but also by a change in clinical practice towards more explicit shared decision-making.

Thus, one might consider the main challenges to overcome as:

  • Increased workload
  • Increasingly varied demands on a radiologist’s time, pulling them away from report-churn
  • Distributed working:
      • between different institutions in an Enterprise/regional care system (we will include home reporting etc. in this domain)
      • between different departments or specialisms in the same organisation

I have sought to break the use cases down across four broad stages of the patient journey:

Pre-Imaging; Image Acquisition; Image Interpretation and Reporting; Post-Reporting Pathways.

1)  Pre-Imaging

i. Clinical Decision Support – tools at the requesting stage which may guide clinicians to the appropriate single best test or suite of tests for a given presentation or differential

ii. Optimised Scheduling – both within an Enterprise and with patients, to route appointments to the most convenient and efficient location and scanner and so enhance productivity

iii. Enhanced Digital Communication with patients (including Electronic Consent) – tools to better prepare a patient with information about what a test involves, how to prepare for it, and why this matters.

The summated benefit of these measures would be to eliminate time wasted through poor scanner scheduling, reduce the incidence of no-shows, and deliver the information a patient needs for a scan at a time when they are more receptive, rather than during the stressful period of attendance. They might also reduce the time and support needed during the scan itself (for example, by setting expectations around positioning).

2)  Image Acquisition Stage

i. AI-assisted image acquisition to reduce the time taken to scan (for example, multi-parametric MRI) and reduce the number of poor-quality images, thus potentially also improving accuracy and reducing the need for recalls

ii. AI-assisted Dose Management – at a macro level, by reducing signal noise to improve the image quality of lower-dose scans, and at an individual patient level

iii. Real-time, on-scanner image detection/analysis. This could itself have a number of potential benefits. The vast majority of scans reviewed by radiologists are undertaken “cold” – that is, when the patient is no longer in the radiology department, or usually not even in the hospital (outpatient scanning). On-modality analysis may allow stratification, for example:

  1. Critical finding that requires immediate/urgent medical attention
  2. Abnormalities that require urgent/expedited reporting
  3. Normal scan – automated reporting of normal examinations for near-contemporaneous feedback to expedite management and earlier reassurance to patients
  4. A subset of the above might even be the detection of changes to known pathology (for example, a nodule or cancer follow-up), with either automation of “no change” or prioritisation of “significant change” findings.

3)  Image Interpretation and Reporting

i. Examination-Routing: intelligent worklist management to ensure that examinations are reviewed as quickly and efficiently as possible by the most appropriate person, based on rules such as:

  1. Urgent findings
  2. Specialism
  3. Key Performance Indicators/metrics
  4. “Normal” pathways as alluded to above
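
As an illustrative sketch only – the rule names, priorities and data fields below are hypothetical assumptions for the sake of example, not drawn from any real RIS/PACS product – such rule-based routing might look like:

```python
# Hypothetical sketch of rule-based examination routing for a worklist.
# All field names, rules and worklist names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Examination:
    modality: str            # e.g. "CT", "MRI", "XR"
    body_part: str           # e.g. "chest", "neuro"
    urgent_finding: bool     # flagged by on-scanner analysis
    flagged_normal: bool     # candidate for an automated "normal" pathway
    hours_waiting: float     # time since acquisition (a KPI input)

def route(exam: Examination) -> str:
    """Return a worklist name, applying the rules in priority order."""
    if exam.urgent_finding:
        return "urgent-review"                # 1. urgent findings first
    if exam.flagged_normal:
        return "automated-normal-pathway"     # 4. "normal" pathway
    if exam.body_part == "neuro":
        return "neuroradiology"               # 2. specialism routing
    if exam.hours_waiting > 24:
        return "breach-risk"                  # 3. KPI/metric-driven escalation
    return "general-reporting"

# Example: an urgent CT head goes to the urgent list regardless of specialism.
print(route(Examination("CT", "neuro", True, False, 2.0)))   # → urgent-review
```

The point of the ordering is that the rules compose: an urgent finding always trumps specialism or KPI routing, while the automated “normal” pathway drains routine work before metric-driven escalation applies.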

ii. Optimised Presentation of Imaging – ready for reporting: beyond the bane of a radiologist’s life that is “hanging protocols” and “relevant priors”, more broadly this would mean bringing appropriate investigations, clinical information and findings from outside radiology to the reporter’s attention, enhancing quality and reducing time wasted hunting across multiple sources.

iii. Lesion Segmentation and tracking – yes, I recognise there are eleventy billion algorithms in the wild or in development that profess to do this, but instead of “App stores” requiring human intervention to pull individual pieces of software to run, and then needing user input to validate each nodule, options could include (but are not limited to):

  1. Baked into a natural workflow which (for example) automatically segments out lesions (across the ENTIRE image acquisition, not just in individual body-part models), measures them, detects changes in prior lesions and presents them as summarised findings in the report.
  2. On-demand Analysis Aid: humans are generally poor at differentiating between true positives and false positives, so algorithm-segmented “nodules” presented for validation might lead to over-calls. Instead, an interactive tool might be activated on demand to provide a “second opinion” on a region of uncertainty, rather than pre-marking multiple regions for a person to accept or reject.

iv. Image Analysis Support – this might involve, for example, access to image libraries with suggestions of possible diagnoses based on pathognomonic features

  1. More specifically, this might involve radiomics features to help classify tumours.
  2. Another example might be an analysis of the attenuation, enhancement characteristics or MR-signal profiles, suggesting the most likely aetiology based on these parameters.
  3. Of course, we should also remember the more prosaic analysis of pathology on plain x-rays (fractures, pneumothoraces, etc.).

v. Natural Language Processing applications might be employed in various guises such as:

  1. Improving the accuracy of voice recognition, correcting typographical errors while reporting, or deploying suggested-next-word methodologies to make reporting more efficient.
  2. Automatic generation of a report summary based on the body of the text, including details such as auto-inserting a TNM stage based on descriptors of the pathology.

vi. Report-Creation – the next step from assisted reporting would be independent report creation modules. We are already seeing some of these in development in the Breast Radiology space but possibilities include:

  1. Breast Second Reader applications – helping to address the massive shortage of radiologists set against the requirement to double-report mammograms
  2. Full Template Reporting – as discussed in the image acquisition phase, if the analysis deems an examination normal there is no reason this could not generate an appropriate report, potentially massively reducing the reporting burden of the normals. Indeed, this could equally work with (for example) x-rays for fractures, coupled with appropriate routing of the reports.

vii. Clinical Decision Support – access to latest pathways and protocols to ensure radiologist advice conforms to current standards (for example for lesion/nodule follow-up guidelines)

4)  Post-Reporting Pathways

This would involve various facets of automatic or optimised routing of the report and its findings, such as:

i. Automatic notification to responsible clinicians of critical findings

ii. Automatically scheduling a case to be discussed at the next appropriate MDT

iii. Scheduling/requesting appropriate onward examinations based on the examination findings, such as PET-CT or interval CT for nodules, as per guidelines

The aim of the radiological journey with applied intelligence is that it should result in greater efficiency across the end-to-end pathway without increasing the administrative burden on users to deploy it. The net result would be faster, more efficient, patient-centric imaging. By considering some of the fully automated outcomes – for example, for normal imaging – we could also seek to redress the massive differential between imaging demand and capacity.

Of course, no Applied Intelligence pathway should be deployed without being rigorously tested and validated – much like any new system deployed in healthcare, human or digital.

Dr. Malik is a consultant radiologist at Royal Bolton NHS Foundation Trust, where he is Trust PACS and Imaging Lead, Associate CCIO and Divisional Clinical Governance Lead. This article originally appeared on South Manchester Radiology.