A patient once walked into our clinic wearing only a hospital gown, his feet bare and EKG wires trailing. Just hours after surgery, dementia had prompted him to wander out of the hospital and walk two miles to proudly show a familiar face his new surgical scar. He was physically unharmed; his heart had been easy to fix, but his memory was beyond repair.
Though the obstacles to a cure have long seemed insurmountable, dementia advocates have recently found reason to celebrate. Scientists announced this week the development of a new tool that may help identify people who are prone to Alzheimer’s disease, and Bill Gates has made a $100 million pledge to join the fight. These vital research dollars give renewed hope to millions of families who already realize that by the time any kind of dementia is diagnosed, treatment options are incredibly limited.
The aging of populations worldwide is creating many healthcare challenges, among them a growing number of dementia patients. One recent estimate suggests that 13.9% of people above age 70 currently suffer from some form of dementia, such as Alzheimer’s or the dementia associated with Parkinson’s disease. The Alzheimer’s Association predicts that by 2050, 135 million people globally will suffer from Alzheimer’s disease.
While these are daunting numbers, some forms of cognitive disease can be slowed if caught early enough. The key is early detection. In a recent study, my colleague and I found that machine learning can offer significantly better tools for early detection than those physicians traditionally use.
One of the more common traditional methods for screening and diagnosing cognitive decline is called the Clock Drawing Test. Used for over 50 years, this well-accepted tool asks subjects to draw a clock on a blank sheet of paper showing a specified time; they are then asked to copy a pre-drawn clock showing the same time. This paper-and-pencil test is quick and easy to administer, noninvasive, and inexpensive. However, the results depend on the subjective judgment of the clinicians who score the tests. For instance, doctors must determine whether the clock circle has “only minor distortion” and whether the hour hand is “clearly shorter” than the minute hand.
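To make the idea concrete, here is a minimal sketch of how a machine-learning approach can replace those subjective judgments with a learned score. The feature names, weights, and thresholds below are entirely hypothetical illustrations, not the actual markers or model from our study; a real system would learn its weights from thousands of labeled clock drawings.

```python
import math

# Illustrative sketch: scoring digitized clock-drawing features with a
# simple linear model. All feature names and weights are hypothetical.

def clock_risk_score(features, weights, bias=0.0):
    """Weighted sum of drawing features, squashed to a 0-1 risk score."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing

# Hypothetical features a digitizing pen or scanner might extract:
weights = {
    "circle_distortion": 2.0,   # how far the clock face deviates from a circle
    "hand_length_ratio": -1.5,  # hour-hand length / minute-hand length
    "pen_pause_seconds": 0.8,   # total hesitation time while drawing
}
healthy = {"circle_distortion": 0.1, "hand_length_ratio": 0.6, "pen_pause_seconds": 1.0}
impaired = {"circle_distortion": 0.9, "hand_length_ratio": 1.1, "pen_pause_seconds": 8.0}

print(clock_risk_score(healthy, weights, bias=-3.0))   # low risk score
print(clock_risk_score(impaired, weights, bias=-3.0))  # high risk score
```

The point of the sketch is that a learned model applies the same scoring rule to every drawing, whereas two clinicians may disagree on whether a circle shows “only minor distortion.”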
A few weeks ago I wrote a post about the unbelievable cost associated with Alzheimer’s disease and how large a population it is likely to affect. According to an op-ed piece written by Sandra Day O’Connor, among others, it is estimated that by 2050 approximately 13.5 million Americans will be stricken with Alzheimer’s, up from five million today, and that the cumulative price tag for treating Alzheimer’s, in current dollars, will be $20 trillion. For contrast, remember that the annual cost of our ENTIRE healthcare system today is around $2.4 trillion.
This week there was a follow-up piece in the NY Times entitled, “Tests Detect Alzheimer’s Risks, but Should Patients Be Told?” The article described how new diagnostic tests have become available that make it possible to detect early Alzheimer’s and, more interestingly, to predict more accurately one’s likelihood of getting Alzheimer’s in the future. The focus of the article was the moral and ethical dilemma presented by the availability of this knowledge.
Since there is no known treatment for Alzheimer’s, and none on the short-term horizon, physicians with knowledge of a patient’s Alzheimer’s risk are put in an interesting spot. If they tell their patients the bad news, it may have a profound negative effect on their psyches and lead to debilitating depression; if they don’t tell, they are withholding information that might enable patients to prepare their lives more effectively to deal with the oncoming challenges. As the article so well articulates:
“Modern medicine has produced new diagnostic tools, from scanners to genetic tests, that can find diseases or predict disease risk decades before people would notice any symptoms. At the same time, many of those diseases have no effective treatments. Does it help to know you are likely to get a disease if there is nothing you can do?”
In the last several weeks I lost my phone (recovered), my iPod (gone) and even a piece of jewelry (I am pretty sure the cat is guilty). I was at the airport when I couldn’t remember where I parked my car for long enough to wonder if I actually did drive myself there. (Don’t judge me; I know you do it too.)
All of us are prone to losing objects and forgetting appointments and struggling for that word on the tip of our tongue that we definitely should know. Sometimes we even forget the names of people who live in our house just for a second; admit it: how many times have you called your child by the dog’s name?
Those momentary lapses of memory can be amusing or frustrating, but they usually don’t slow us down much. We laugh it off and say, “wow, I must be getting old,” and move on to the next task. An op-ed I read recently in the NY Times, however, made me realize we won’t have the luxury of humor on this issue for much longer.
Authored by Supreme Court Justice Sandra Day O’Connor (ret.), Nobel laureate neurologist Dr. Stanley Prusiner, and Age Wave expert Ken Dychtwald, and entitled “The Age of Alzheimer’s,” the article pointed out these astonishing facts:
Starting on Jan. 1, our 79-million-strong baby boom generation will be turning 65 at the rate of one every eight seconds. That means more than 10,000 people per day, or more than four million per year, for the next 19 years facing an increased risk of Alzheimer’s. Although the symptoms of this disease and other forms of dementia seldom appear before middle age, the likelihood of their appearance doubles every five years after age 65. Among people over 85 (the fastest-growing segment of the American population), dementia afflicts one in two. It is estimated that 13.5 million Americans will be stricken with Alzheimer’s by 2050 – up from five million today.