There has been much controversy in health policy circles recently about hospital market consolidation and its effect on costs. Less noticed than the quickening pace of consolidation, however, is a more puzzling development: hospitals seem to have hit a wall in technological innovation. One wonders whether the two phenomena are somehow related.
During the last three decades of the twentieth century, health policymakers warned constantly that medical technology was driving up costs inexorably, and that unless we could somehow harness technological change, we would be forced to ration care. The most prominent statement of this thesis was Henry Aaron and William Schwartz's Painful Prescription (1984). Advocates of technological change countered that higher prices for care were justified by substantial qualitative improvements in hospitals' output.
Perhaps policymakers should be careful what they wish for. The care provided in the American hospital of 2013 seems eerily similar to that of the hospital of the year 2000, albeit far more expensive. This is despite some powerful incentives for manufacturers and inventors to innovate (an aging boomer generation, advances in materials, and a revolution in genetics) and the widespread persistence of fee-for-service insurance payment, which rewards hospitals for offering a more complex product.
Technology junkies should feel free to quarrel with these observations. But the last major new imaging platform in the health system was PET, which was introduced into hospital use in the early 1990s. Though fusion technologies like PET/CT and PET/MR were introduced later, the last "got to have it" major imaging product was the 64-slice CT scanner. Both PET and CT angiography were the subjects of fierce controversy over CMS decisions to pay for the services.