North Carolina Medicaid recently reported, for the third time and using a third consulting firm, that it had achieved massive savings through its patient-centered medical home (PCMH) program, now called Community Care of North Carolina (CCNC). Among other things, CCNC pays physicians more money to encourage and compensate behaviors and processes, including enhanced access to care and case management, in the hope of reducing the need for emergency and inpatient services. (A brief summary of this and past consulting reports appears in the current issue of Modern Healthcare. http://www.modernhealthcare.com/article/20120218/MAGAZINE/302189938/1140)
However, the third time is not a charm. Notwithstanding these consultants' reports (which paradoxically support my contrary conclusion by choosing to ignore the overwhelming data contradicting their own claims), the program is a total failure as far as reductions in cost and inpatient utilization are concerned.
Fact #1: According to the Medicaid and CHIP Payment and Access Commission (MACPAC) report to Congress http://www.macpac.gov/reports, North Carolina is by a significant margin the highest-cost state per capita in its region for both adult and child Medicaid spending. These are the two categories in which the PCMH has been in place the longest. In the "aged" category, in which the PCMH had barely been started when the MACPAC data were compiled (and in which it would not noticeably affect medical costs anyway, because the state is a "secondary payer" behind Medicare, and most Medicaid "aged" spending is custodial), North Carolina is the lowest-cost state in the region.
Further, North Carolina is high-cost only for its <65 population covered by Medicaid: according to the Commonwealth Fund, commercial coverage (premium + annual deductible) costs only slightly more than the average for the region. http://www.commonwealthfund.org/Publications/Issue-Briefs/2011/Nov/State-Trends-in-Premiums.aspx
Fact #2: None of the originally projected and subsequently claimed utilization changes is discernible at all in the statewide Medicaid data, let alone on the scale (perhaps 40,000–50,000 avoided admissions per year out of 240,000+) required to save the claimed billions. From 2000 to 2009 (complete calendar-2010 data are not publicly available yet), even as more and more members were enrolled in the PCMH to avoid more and more admissions, the total admission rate for North Carolina was basically unchanged, almost exactly paralleling the experience of low-cost South Carolina, which uses a classic managed care model. South Carolina enjoyed a slightly lower admission rate in each year, with even better absolute and relative performance as the decade drew to a close. (Despite the large amount of data for South Carolina, the consultants didn't use it, or any other state, as a control for North Carolina.)
Fact #3: Looking at the ICD-9 subcategories comprising the two largest admission categories on which the PCMH focused for most of the decade, asthma and diabetes, the same result held true vs. South Carolina.
Fact #4: Likewise, looking at the ICD-9 subcategories comprising AHRQ's list of preventable admissions, the same was true.
Fact #5: "Preventable" is a term of opinion, not science, so reasonable people may differ on what gets counted in that category. For the purposes of PCMH-preventable (as opposed to wellness-program-preventable, for example), let's define a "preventable" event as (1) a fairly common event (2) that is generally diagnosable and treatable, (3) where early access/intervention makes a major difference, (4) where many of the events are complications of a chronic condition whose management is already being emphasized, (5) where patients don't have to change their lifestyle but rather just take a pill, and (6) where you don't need to wait years for results. Perhaps the event best fitting those criteria would be cellulitis. Cellulitis admission rates increased by almost exactly the same percentage over the decade in both states. That was actually a slightly better performance for North Carolina than in the other three comparisons, in that the state didn't do worse than South Carolina.
To summarize the facts: there was no utilization change attributable to the program, and the program's increased costs apparently cause, or at least contribute to, North Carolina's status as the highest-cost Medicaid state in the region specifically in the member categories most affected by the PCMH.
Yet three well-known and highly taxpayer-compensated teams of consultants arrived at the opposite answer. One might ask, how did the consulting teams refute, address, distinguish or interpret this same data the opposite way, to conclude that billions of dollars were saved over the decade?
Rather than refute the data, all three consulting firms – Mercer, TREO, and Milliman – elected to omit the above data from their reports altogether, without a mention. In other words:
(1) They concluded that North Carolina had reduced its costs substantially, without mentioning the federal data showing the relative cost position of the relevant population to be the highest in the region;
(2) They concluded that Medicaid inpatient utilization trends had declined substantially, without mentioning the federal database of Medicaid inpatient utilization trends showing the opposite.
Because both data sources are in the public domain, readily found and widely used (the AHRQ database from which the utilization statistics are derived is at http://hcupnet.ahrq.gov/HCUPnet.jsp ), one interpretation might be that omitting them implies these consultants know full well their conclusions are unsupportable.
The other interpretation would be that they didn’t know about these databases, though they are well-known to population health outcomes experts, which they held themselves out to be. Also, there had already been a well-publicized presentation by Mathematica on applying this data to North Carolina http://www.ehcca.com/presentations/MedHome20100526/peikes.pdf as well as several other reports saying exactly the same thing, including from the Kaiser Family Foundation http://www.kff.org/medicare/upload/7984.pdf.
What did these consultants do instead? Rather than look at the definitive databases of statewide utilization and cost, they used complex analytic models mostly to find outcomes that any trained observer would immediately conclude to be mathematically and epidemiologically impossible. For instance, Mercer found that the majority of the state’s dollar savings came from infants, a 54% reduction in overall costs in that age category.
This is blatantly wrong in four ways, any one of which would be sufficient to reject Mercer's 54% reduction finding:
(1) A 54% overall reduction in this age bracket would require a mathematically impossible >100% decline in neonatal utilization, since nothing else would be expected to change much;
(2) Mercer never analyzed neonatal utilization to find out whether it even came close to supporting their conclusion;
(3) It didn’t — neonatal utilization in reality was essentially unchanged, according to AHRQ data;
(4) According to the other consultants (Milliman), babies were not enrolled in CCNC in any case, meaning any savings from that category (there were none) would not have been attributable to the program.
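The first point can be made concrete with a back-of-the-envelope sketch. Note that the neonatal share of infant-category costs used below is an assumed illustrative figure, not a number from any of the reports:

```python
# Back-of-the-envelope check of Mercer's 54% infant-category finding.
# ASSUMPTION (illustrative only): neonatal care is ~half of infant-category
# Medicaid costs. No such figure appears in the consultants' reports.
claimed_overall_reduction = 0.54   # Mercer's claimed cost reduction, infant category
neonatal_share_of_costs = 0.50     # assumed share of infant costs that is neonatal

# If only neonatal utilization could plausibly change, the required decline
# in neonatal costs is the overall reduction divided by neonatal's share:
required_neonatal_decline = claimed_overall_reduction / neonatal_share_of_costs
print(f"Required neonatal decline: {required_neonatal_decline:.0%}")
# prints "Required neonatal decline: 108%" -- i.e., more than 100%, impossible
```

Any assumed neonatal share below 54% of category costs makes the required decline exceed 100%, which is why the claim fails regardless of the exact share.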
Milliman found an overall savings of 15% for adults and children. Since admissions consume no more than half the total cost of adult/child Medicaid spending, and, as Milliman correctly points out, the savings are all in inpatient and ER care (the ER being a much smaller cost category in which, you guessed it, North Carolina's utilization still exceeds South Carolina's), that overall 15% decline would require about a 30% reduction in admissions. Since preventable admissions account for only about 10% of all admissions according to AHRQ, preventable admissions would have needed to decline by 300% (actually a bit less, because some people are still not in CCNC) in order to achieve that 15% overall decline. A 300% decline in anything is mathematically impossible, of course, but preventable admissions didn't decline at all in any case. Nor did non-preventable admissions.
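The arithmetic above can be checked in a few lines, using only the figures already cited (15% claimed savings, admissions at most half of spend, preventable admissions about 10% of all admissions):

```python
# Arithmetic behind the Milliman claim, using only figures cited above.
claimed_overall_savings = 0.15       # Milliman's claimed adult/child savings
inpatient_share_of_spend = 0.50      # admissions are at most half of total spend
preventable_share_of_admits = 0.10   # AHRQ: ~10% of admissions are "preventable"

# Savings concentrated in inpatient care imply this overall admission decline:
required_admission_decline = claimed_overall_savings / inpatient_share_of_spend

# If only "preventable" admissions can decline, they must fall by:
required_preventable_decline = required_admission_decline / preventable_share_of_admits
print(f"Required decline in preventable admissions: {required_preventable_decline:.0%}")
# prints "Required decline in preventable admissions: 300%" -- impossible
```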
Two consulting firms, two mathematically impossible answers, two ignored federal databases supporting a contrary conclusion. (The TREO work is omitted for space reasons, but is also unsupportable.) These firms, in the immortal words of the great philosopher Ricky Ricardo, have a lot of ‘splaining to do.
But no 'splaining is forthcoming. Mercer has never addressed the issue of its impossible findings. Milliman was invited to next Sunday's presentation of these data at the Thomas Jefferson University conference http://www.populationhealthcolloquium.com/agenda/bookclub.html#miniprecon. They declined, telling Modern Healthcare that they didn't want to pay the $195 admission fee.
I then both publicly and privately (and with uncharacteristic grace) offered to finance their travel expenses plus pay them $2000/day to successfully defend their taxpayer-financed study, for which they were already paid more than I make in a year and for which they had access to the state’s data, against my own spare-time observations made without any proprietary state data. They declined again. I guess it wasn’t about the $195 after all.
Finally, how is this program "Medicaid's Solyndra"? Just as with Solyndra, the federal government is making a "bet" on one project, heavily subsidizing this model with a 9-to-1 match. The result is the same as with Solyndra, except that North Carolina Medicaid will never go bankrupt, because it can always get more funds from the state legislature, multiplied by Washington.
Some might cite this example as the poster child for block grants for Medicaid, while others might say in general that consulting firms to evaluate Medicaid outcomes should be hired by the state comptroller’s office rather than the department overseeing Medicaid. If nothing else, this case study suggests that allowing any state agency to hire consulting firms at taxpayer expense to justify its programs creates an inherent conflict of interest, especially when increased program expenses can be passed on to Washington.
One logistical point: The nature of a posting like this is that squeezing in all the exact information is impractical. Therefore I would ask the Usual Suspect THCB commenters who, being beneficiaries of PCMH, intend to defend the consultants’ impossible findings (and defend their choice to omit contradictory data) and, by implication, the lucrative PCMH model, to hear the entire presentation before commenting. Either stream it or show up in person at Thomas Jefferson University’s PCMH conference. Then you can get right on the permanent electronic record with your objections after seeing all the slides. Since the state’s consultants are not, as of this writing, intending to come defend their client on my nickel, you can do it instead on your own.
Al Lewis, widely credited with inventing disease management, is author of the forthcoming Why Nobody Believes the Numbers (John Wiley & Sons, June 2012), the introduction to which may now be downloaded gratis from www.dismgmt.com. He also runs the popular course and certification program for Critical Outcomes Report Analysis http://www.dismgmt.com/certs/cora/self-study and was named the “leading authority on care management outcomes measurement” by the 9th Annual Report on the Disease Management and Wellness Industries (Health Industries Research Co., 2010).