
By KIM BELLARD
I feel like I’ve been writing a lot about futures I was pretty worried about, so I’m pleased to have a couple of developments to talk about that remind me that technology is cool and that healthcare can surely use more of it.
First up is a new AI algorithm called FaceAge, published last week in The Lancet Digital Health by researchers at Mass General Brigham. It uses photographs to estimate biological age, as opposed to chronological age. We all know that different people seem to age at different rates – I mean, honestly, how old is Paul Rudd??? – but until now the link between how people look and their health status has been intuitive at best.
Better yet, the algorithm can help predict survival outcomes for various types of cancer.
The researchers trained the algorithm on almost 59,000 photos from public databases, then tested it on photos of 6,200 cancer patients taken prior to the start of radiotherapy. On average, the cancer patients appeared to FaceAge to be some five years older than their chronological age. “We can use artificial intelligence (AI) to estimate a person’s biological age from face pictures, and our study shows that information can be clinically meaningful,” said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham.
Curiously, the algorithm doesn’t seem to care whether someone is bald or has grey hair; it may be relying on more subtle cues, such as muscle tone. It is also unclear what difference makeup, lighting, or plastic surgery makes. “So this is something that we are actively investigating and researching,” Dr. Aerts told The Washington Post. “We’re now testing in various datasets [to see] how we can make the algorithm robust against this.”
Moreover, it was trained primarily on white faces, which the researchers acknowledge as a deficiency. “I’d be very worried about whether this tool works equally well for all populations, for example women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women and the like,” Jennifer E. Miller, the co-director of the program for biomedical ethics at Yale University, told The New York Times.
The researchers believe FaceAge can be used to better estimate survival rates for cancer patients. It turns out that when physicians try to gauge survival simply by looking at a patient, their guesses are about as accurate as a coin toss. Paired with FaceAge’s insights, accuracy can rise to about 80%.
Dr. Aerts says: “This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters—individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy.”
I’m especially thrilled about this because ten years ago I speculated about using selfies and facial recognition AI to determine if we had conditions that were prematurely aging us, or even if we were just getting sick. It appears the Mass General Brigham researchers agree. “This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age,” said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. “As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual’s aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.”
The researchers acknowledge that much has to be accomplished before it is introduced for commercial purposes, and that strong oversight will be needed to ensure, as Dr. Aerts told WaPo, “these AI technologies are being used in the right way, really only for the benefit of the patients.” As Daniel Belsky, a Columbia University epidemiologist, told The New York Times: “There’s a long way between where we are today and actually using these tools in a clinical setting.”
The second development is even more out there. Let me break down the Caltech News headline: “3D Printing.” OK, you’ve got my attention. “In Vivo.” Color me highly intrigued. “Using Sound.” Mind. Blown.
That’s right. This team of researchers has “developed a method for 3D printing polymers at specific locations deep within living animals.”