
Medicine’s Missing Foundation for Health Care Reform: Part 2 – Medicine and the Development of Science


“It is in vain to expect any great progress in the sciences by the superinducing or engrafting new matters upon old. An instauration must be made from the very foundations, if we do not wish to revolve forever in a circle, making only some slight and contemptible progress.”

                                      — Francis Bacon3

   Scientists face a wide gap between limited human capacities and the demands of effective practice.  To bridge that gap, scientists use external tools, such as measuring instruments, the microscope, the telescope, and, in recent decades, the computer.  The same is true of physicians and researchers in the applied science of medicine.  Everything from stethoscopes to advanced imaging devices, for example, makes possible clinical observations that are not otherwise within human capacity.  So too, computer technology now makes possible information processing that physicians and researchers could not otherwise accomplish:

The dominant trend in biomedical science and in medical practice, as in every realm of science, is the increasing value and usage of computers.  The data so painstakingly extracted in past years are now, through progress in biomedicine, produced in such volumes as to require computers just to record them.  The scientist spends more and more time using the computer to record, analyze, compare and display their data to extract knowledge.4

   This statement begins by equating medical science and medical practice.  Yet, the examples given are drawn from science, not practice.  Using the computer to extract new knowledge for medical science differs from using it to apply existing knowledge for medical practice.  And, within medical practice, using the computer as a component of medical instruments to enhance the user’s physical capabilities differs from using the computer as an information tool to empower the mind for clinical decision making.

   These distinctions suggest that physicians and scientists differ fundamentally in their approach to limited human capacities.  Physicians recognize limits in their capacity for observation and data processing, but not in their capacity for applying medical knowledge.  Thus, the most advanced, costly and ubiquitous use of computer technology in modern medicine is in sophisticated clinical imaging devices.  Through these devices, physicians collect detailed data and use software to assemble the data into images of internal organs.  By comparison, physicians rarely use computer software to assemble patient data and medical knowledge into options and evidence for medical decision making.  Instead, physicians rely largely on personal intellect (“clinical judgment”) for this pivotal function.

        A. Intellect and the culture of science

   In contrast to medical practice, science has advanced by developing alternatives to unaided judgment.  These developments made possible intellectual operations that would otherwise be prohibitively laborious and prone to error.  The development of mathematics, for example, was described in these terms by Alfred North Whitehead.  He argued that confining the role of judgment facilitates development of system or method while freeing the mind for tasks where judgment is essential.  Writing of geometry before Descartes, Whitehead observed: “Every proposition has to be proved by a fresh display of ingenuity; and a science of which this is true lacks the great requisite of scientific thought, namely, method” (emphasis added).5  Writing of algebra, he observed that using symbols in equations “is invariably an immense simplification. … by the aid of symbolism, we can make transitions in reasoning almost mechanically by the eye, which otherwise would call into play the higher faculties of the brain.”  Writing of arithmetic, he explained the simplifying effects of notation:

By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race.  Before the introduction of the Arabic notation, multiplication was difficult, and the division even of integers called into play the highest mathematical faculties.  … Our modern power of easy reckoning with decimal fractions is the almost miraculous result of the gradual discovery of a perfect notation.

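   To make Whitehead’s point about notation concrete, consider a small worked illustration (my example, not his): a product that demands real ingenuity in Roman numerals reduces, in positional decimal notation, to a mechanical routine of partial products and addition:

$$\mathrm{XLVII} \times \mathrm{XXIII} \;=\; 47 \times 23 \;=\; (47 \times 20) + (47 \times 3) \;=\; 940 + 141 \;=\; 1081.$$
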
   Giving these examples from mathematics, Whitehead then stated a broader principle: “It is a profoundly erroneous truism … that we should cultivate the habit of thinking about what we are doing.  The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them.”6

   A prime example is the invention of writing.  The tools and techniques of writing extend our minds to past thoughts and words without our having to recall them.  Indeed, Gibbon observed that our capacity for “knowledge and reflection” depends in large part on the use of writing:

Without that artificial help, the human memory soon dissipates or corrupts the ideas entrusted to her charge; and the noble faculties of the mind, no longer supplied with models or with materials, gradually forget their powers; the judgment becomes feeble and lethargic, the imagination languid or irregular.  …7

But science requires more than enhancement of personal judgment and imagination.  Also required is the simple capacity to effectively process the raw material of science—information.  For this purpose, the mind is untrustworthy.  Although its powers of instinctive judgment are impressive in some contexts, the mind is “a relatively inefficient device for noticing, selecting, categorizing, recording, retaining, retrieving and manipulating information for inferential purposes.”8  This reality explains why digital information technology represents a turning point in the history of science.  Scientists were quick to exploit advances in that technology as they came about in the second half of the 20th Century.

   Thus far we have discussed how tools and techniques for aiding the mind bridge the gap between human cognitive limits and the demands of science.  But we have not addressed other gaps that science must bridge:  gaps between normal human behaviors and the rigorous habits of careful investigators, between individual, subjective experience and shared, objective knowledge, between limited individual capacities and the greater capacities of social, cooperative endeavors.

   To bridge these gaps, science has developed a variety of social and technical practices.  These practices include enforcing habitual use of tools and techniques to aid the mind.  These practices also include simple standards of thoroughness and reliability.  Disciplined practices and behaviors of this kind are essential to scientific progress:

The dazzling achievements of Western post-Galilean science are attributable not to our having any better brains than Aristotle or Aquinas, but to the scientific method of accumulating objective knowledge. A very few strict rules (e.g. don’t fake data, avoid parallax in reading a dial) but mostly rough guidelines about observing, sampling, recording, calculating and so forth sufficed to create this amazing social machine for producing valid knowledge. Scientists record observations at the time rather than rely on unaided memory. Precise instruments are substituted for the human eye, ear, nose and fingertips whenever these latter are unreliable. Powerful formalisms (trigonometry, calculus, probability theory, matrix algebra) are used to move from one set of numerical values to another.9

These practices introduce reliability, order and transparency to the raw material of science—information.  This is achieved by compensating for the variable habits and limited abilities employed in measuring, recording and manipulating information.  That compensatory function also empowers the mind’s capacities for judgment and imagination, but its first purpose is to enable trustworthy information processing.

   Scientific behaviors do more than bridge the gap between normal human behaviors and the rigorous habits that science requires.  Scientific behaviors also link the individual with other minds, bridging further gaps between individual, subjective experience and shared, objective knowledge, between limited individual capacities and the greater capacities of social, cooperative endeavors.

   This aspect of science is illuminated by Karl Popper’s distinctions among three different realms to which human thought and knowledge relate:  the world of physical objects or states (World 1), the world of mental states or conscious experiences (World 2), and the world of the objective contents of thought, residing not just in the mind but externally in books, electronic storage, works of art and elsewhere (World 3).  World 3 has objective content existing independently of the mind.  Moreover, “World 3 is autonomous:  in this world we can make theoretical discoveries in a similar way to that in which we can make geographical discoveries in World 1.”10   Popper’s view departs from traditional epistemology.  “Traditional epistemology has studied knowledge or thought in a subjective sense—in the sense of the ordinary usage of the words ‘I know’ or ‘I am thinking.’”  Popper distinguished knowledge in this subjective sense from scientific knowledge.  “While knowledge in the sense of ‘I know’ belongs to [World 2], the world of subjects, scientific knowledge belongs to [World 3], the world of objective theories, objective problems and objective arguments.”11  Popper characterized scientific knowledge in terms of theories, problems and arguments because scientific knowledge is conjectural and always potentially subject to refutation.

   By moving knowledge from World 2 to World 3, we create new opportunities to access knowledge, test it and apply it to human needs.  Moving knowledge from World 2 to World 3 thus fosters an evolutionary process of natural selection, with both errors and new knowledge coming to light.

   Technologies like the printing press and the computer, techniques like decimal notation, and simple practices like recording data at the time of observation instead of relying on unaided memory are powerful because they accelerate the movement of knowledge from World 2 to World 3.  This movement is central to the culture of science.

   Remarkably, Francis Bacon envisioned these dimensions of scientific culture at its birth four hundred years ago. As the first thinker who systematically examined the mind’s role in the advancement of science, Bacon recognized that external aids to the mind are pivotal:

The unassisted hand and the understanding left to itself possess little power. Effects are produced by means of instruments and helps, which the understanding requires no less than the hand … those that are applied to the mind prompt or protect the understanding. … The sole cause and root of almost every defect in the sciences is this, that while we falsely admire and extol the powers of the human mind, we do not search for its real helps.12 

   Bacon reacted against academic and ecclesiastical dogma, with its static dependence on the minds of ancient authorities (Aristotle in particular) and its sterile mode of inquiry (formal, Scholastic disputation).  He became deeply skeptical of abstract thought divorced from observation and experience, writing:  “… we must bring men to particulars, and their regular series and order, and they must for a while renounce their notions and begin to form an acquaintance with things.”13  Bacon was enormously impressed by the learning from experience of those engaged in commercial and practical activities.  He also witnessed a flowering of intellectual life outside the universities.  He came to view science and practical learning as cumulative, collaborative activities that escape the limits of received authority and the individual mind.14

   Bacon saw a path that led away from the alchemy and astrology of his time and towards remarkable advances in science and technology over the last four hundred years.  That progress has involved a symbiotic, evolving relationship among the creative minds of individuals, tools and practices for observation and experiment, social practices for systematic feedback on received knowledge, market and non-market systems for generating, disseminating and applying advances in knowledge, and finally, in recent decades, revolutionary information technologies that empower the human mind by providing an alternative to its limited capacities.

   Analysis of the limits of the mind was central to Bacon’s philosophy.  Anticipating several currents of 20th century thought, he identified four “idols of the mind” that distort human thinking and perception:

  •   universal mental limitations “inherent in human nature”;
  •   each person’s disposition and acquired beliefs; each “has his own individual den or cavern, which intercepts and corrupts the light of nature”;
  •   the limits of language, which “force the understanding, throw everything into confusion, and lead mankind into vain and innumerable controversies and fallacies”;
  •   “various dogmas” in philosophy and the sciences, “which have become inveterate by tradition, implicit credence and neglect.”15

   Bacon understood that for both the individual and society, overcoming these idols of the mind was a difficult challenge.  “Our only remaining hope and salvation is to begin the whole labour of the mind again; not leaving it to itself, but directing it perpetually from the very first, and attaining our end as it were by mechanical aid.”16

   Some readers may dismiss this notion as a mere rationalization for “cookbook medicine.”  The reality is the opposite.  Cookbook medicine results from the weaknesses of the unaided mind.  Unwarranted variations in practice exist because each practitioner writes a personal cookbook.  Evidence-based medicine seeks to replace that variation with uniformity, but it does so by failing to take into account the medical uniqueness of each patient.  That uniqueness can be taken into account only when scientific rigor is brought to medical practice.

         B. Intellect and the culture of medicine

   The first of Bacon’s idols of the mind—universal mental limitations “inherent in human nature”—has been studied in modern cognitive psychology for more than half a century.  Yet this school of thought competes with another, which shows that the mind has impressive powers of instinctive judgment in some contexts, including medicine.17  That research suggests that external tools cannot replicate instinctive judgment or the “tacit knowledge” on which it relies.  At the same time, a third school of thought discounts the power of both instinctive judgment and deliberate judgment (with the explicit knowledge on which it relies) as bases for expert decision making, in medicine and other fields.  This school has studied two modes of combining items from a data set about an individual or group for a predictive or diagnostic purpose.  One method relies on human expert judgment based on informal contemplation and sometimes discussion with others (case conferences, for example).  The other relies on formal, algorithmic, objective procedures (weighted sums of predictive factors, for example).  Empirical comparisons show that the latter, mechanical method is usually equal or superior to the former, judgmental method.18  “In fact, there is even evidence that when [mechanical] aids are offered, many experts attempt to improve upon these aids’ predictions—and they do worse than they would have had they ‘mindlessly’ adhered to them.”19
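
   The contrast between the two methods can be made concrete in code.  A minimal sketch follows; the predictive factors, weights and patient data are hypothetical illustrations, not a validated clinical instrument.  The point is only that a mechanical rule combines the same items in the same way every time:

```python
# A minimal sketch of the "mechanical" method: a fixed, weighted sum of
# coded predictive factors.  Factors and weights are hypothetical
# illustrations, not a validated clinical instrument.

def actuarial_score(findings, weights):
    """Combine coded findings by a fixed linear rule, with no case-by-case judgment."""
    return sum(weight * findings.get(factor, 0.0)
               for factor, weight in weights.items())

# Hypothetical factors (coded 1.0 = present, 0.0 = absent) and fixed weights.
WEIGHTS = {"age_over_65": 1.5, "abnormal_ecg": 2.0, "prior_event": 2.5}

patient = {"age_over_65": 1.0, "abnormal_ecg": 0.0, "prior_event": 1.0}

# Unlike informal contemplation, the rule is transparent and reproducible:
# the same inputs always yield the same score.
print(f"risk score: {actuarial_score(patient, WEIGHTS):.1f}")  # risk score: 4.0
```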

   Separately from this research in cognitive psychology, clinicians have spent several decades attempting to develop software tools to replicate the deliberate judgment of highly trained and experienced physicians, using the analytical powers and explicit knowledge of pathophysiology on which they rely.  These efforts have had little impact.20 That outcome is consistent with a general critique of formal, rule-based approaches to expert decision making in many fields.  In medicine this critique has been directed at clinical protocols, statistical decision analysis and computer-based tools.  As summarized in a study by Marc Berg, this critique idealizes the “art of medicine” and physician autonomy:

Decision-analytic techniques … are but poor representations of the complexities that go into real-time decision making.  One cannot separate the decision from its context … Such rigid, pre-determined schemes [as protocols] are said to threaten the physician’s “art” by dehumanizing the practice of medicine and by reducing the physician to a “mindless cook” …  Moreover, such tools open the way for increased and uninformed controls by “outsiders.”  … All in all, these critics argue, the tools’ impoverished, codified versions of physicians’ know-how do not do justice to the intricate, highly skillful nature of medical work.  The idea of creating formal tools that make medical decisions is utterly mistaken.  Every attempt to take practical control of the decision process out of the physician’s hands is doomed to fail — and is dangerous.21

     This resistance to formal, rule-based approaches to decision making has been opposed, during the past two decades, by evidence-based medicine.  During that period, critics of the status quo have come to focus on the cognitive vulnerabilities first identified by Francis Bacon (although Bacon usually goes unmentioned). Galvanized by the patient safety movement, the culture of medicine is now acutely aware that epidemics of cognitive error in medicine result from the mind’s normal propensities.  But this awareness has (until recently) focused on execution of medical decisions rather than the decision making process.22  And this awareness has not led physicians to embrace electronic information tools to aid decision making.

      Indeed, physician training, credentialing and functioning remain fundamentally unchanged—even though cognitive error in medicine is now recognized as epidemic, even though consensus has developed on the need for electronic medical records and other “health information technology,” even though health care institutions increasingly use digital technologies for storing, retrieving and communicating information, even though caregivers and patients use the Internet to gain unprecedented access to medical knowledge, and even though enormous amounts of time and money are being expended to develop networks of interoperable health information technologies among disparate systems and institutions. Despite these advances, the physician’s mind remains heavily burdened with the core function of processing information—applying comprehensive general knowledge to inform selection and analysis of patient-specific data in the clinical encounter.

   Given this state of affairs, the question arises whether scientific practice (Bacon’s concern) and medical practice differ in some way that justifies the limited use of information technology in medical decision making.  The answer is that the domains of science and medicine do indeed differ, but, rather than justifying current medical practice, the differences highlight its failure.

   Scientists and practicing physicians engage in fundamentally distinct problem solving activities, in terms of both purpose and context.  First, in terms of purpose, as Chris Weed has observed,23 scientists seek to discover knowledge, while practitioners seek to use established knowledge to solve more-or-less familiar problems.  Although each patient is unique, many patient problems are sufficiently familiar that established knowledge can often be applied effectively.  Unfamiliar problems may arise that are truly inconsistent with or unencompassed by established knowledge.  But practicing physicians are not expected to develop new knowledge about these truly unfamiliar situations.  Instead, physicians seek to apply established knowledge as well as possible to situations that resemble prior practice.

   Second, beyond this difference in purpose, scientists and physicians act in very different contexts.  Research environments shelter scientists from difficulties that practitioners must cope with on a daily basis.  Scientists choose the problems to investigate, they have the time and resources to pursue those problems in depth, and they create controlled conditions needed to isolate and understand relevant variables.  Scientists thus work under ideal conditions for human judgment.  In contrast, practicing physicians must function without the luxuries of choice, ample time, sufficient resources and controlled conditions.  Physicians may not choose which patients they wish to care for, or which patient problems they wish to investigate.  Physicians may devote only limited time and financial resources to each patient, in comparison to what scientists may devote to their investigations.  And physicians have little opportunity to create controlled conditions for isolating variables of interest.  On the contrary, physicians must care for complex patients with multiple interacting variables.  Each patient thus represents a unique combination of countless variables.  That individuality demands rapidly taking into account an enormous amount of medical knowledge and correspondingly detailed patient data.

   A further difference from scientific investigation is that medical practice is even more vulnerable to the universal mental weaknesses that Bacon identified and cognitive psychology has studied.  Medicine involves human situations where personal experience makes indelible impressions (for example, a physician who saves a patient’s life with a chosen therapy and then uncritically uses that therapy with other patients for whom it may not be the best option).  At the same time, medicine involves a vast body of knowledge that is at once too complex for anyone to fully comprehend and yet not complex enough to fully capture the realities of individual patients.   Practitioners, usually operating under severe time pressures, apply whatever knowledge enters the mind at the point of care.  Often that is not the precise knowledge most applicable to the unique patient but rather fragments of personal knowledge and beliefs evoked in the physician’s mind by limited data.   Francis Bacon described the psychological process involved:

The human understanding is most excited by that which strikes and enters the mind at once and suddenly, and by which the imagination is immediately filled and excited.  It then begins almost imperceptibly to conceive and suppose that everything is similar to the few objects which have taken possession of the mind; while it is very slow and unfit for the transition to the remote and heterogeneous instances by which axioms are tried by fire, unless the office be imposed upon it by severe regulations, and a powerful authority.24

   At this point some readers may respond that enforcing “evidence-based medicine” provides the “severe regulations” and “powerful authority” needed to break the hold of personal experience on judgment.  But evidence-based medicine in its present form is slow and unfit to move from the population-based generalizations of medical knowledge to “the remote and heterogeneous instances” of unique patients.  Moreover, evidence-based medicine leaves unsolved the “needle in a haystack” problem—the difficulty of coupling vast knowledge with detailed data to find the crucial combinations of details relevant to an individual patient.
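
   The “needle in a haystack” problem lends itself to a simple illustration in code.  The sketch below is hypothetical (real knowledge coupling involves vastly larger knowledge bases and richer patient data), but it shows the essential move: matching detailed findings against every option in the knowledge base by procedure, rather than by whatever happens to enter the physician’s mind:

```python
# A hypothetical sketch of coupling a knowledge base with patient data.
# Each entry maps a candidate option to the findings that bear on it;
# the entries are placeholders, not real medical content.
KNOWLEDGE_BASE = {
    "option_A": {"finding_1", "finding_2", "finding_3"},
    "option_B": {"finding_2", "finding_4"},
    "option_C": {"finding_5"},
}

def couple(patient_findings):
    """List every option with at least one matching finding, ranked by match count."""
    results = []
    for option, relevant in KNOWLEDGE_BASE.items():
        matched = relevant & patient_findings
        if matched:
            results.append((option, sorted(matched), len(relevant)))
    # Exhaustive and reproducible: no option is forgotten, and none is
    # favored merely because it came to mind first.
    return sorted(results, key=lambda r: len(r[1]), reverse=True)

for option, matched, total in couple({"finding_2", "finding_3", "finding_4"}):
    print(f"{option}: {len(matched)} of {total} relevant findings present")
# option_A: 2 of 3 relevant findings present
# option_B: 2 of 2 relevant findings present
```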

   This state of affairs continues to exist in medicine in large part because of economic and legal factors.  Physicians’ legal monopoly on medical decision making has blocked forces of competition that might otherwise have brought reform long ago.  Indeed, for more than half a century, free market theorists have recognized the need to remove educational and credentialing barriers to competition in medicine.25  But these barriers are entrenched. They protect the professional status quo in terms of power, money, status and self-image.  As the cognitive psychologist Robyn Dawes has observed:

States license psychologists, physicians and psychiatrists to make (lucrative) global judgments of the form “It is my opinion that …” [P]eople have great misplaced confidence in their global judgments, a confidence that is strong enough to dismiss an impressive body of research findings and to find its way into the legal system.26

Denial of cognitive limitations, and reluctance to employ external tools, are not limited to medicine.  Psychologists have examined these phenomena in many fields.  Dawes explains that these phenomena reflect emotional needs:

The greatest obstacle to using [external aids] may be the difficulty of convincing ourselves that we should take precautions against ourselves … .   Most of us … seek to maximize our flexibility of judgment (and power).  The idea that a self-imposed external constraint on action can actually enhance our freedom by releasing us from undesirable internal constraints is not a popular one.  …  The idea that such internal constraints can be cognitive, as well as emotional, is even less palatable.  …27

   In medicine, expert judgment is idealized.  In the words of one distinguished clinician, “application of knowledge at the bedside is largely the function of the sagacity inherent in or personally developed by the individual physician.”28  Sherwin Nuland has further described this ideal of personal sagacity:  “every doctor’s measure of his own abilities … the most important ingredient in his professional self-image” is “to understand pathophysiology” and thereby “to make the diagnosis and to design and carry out the specific cure.”29

   Science seeks to protect against this kind of reliance on individual cognition.  Such protection is needed in medicine even more than in science, because of the time and resource constraints and the financial influences that operate in medical practice.  In this regard, medicine resembles the domain of commerce.  The comparison is important because medicine lags far behind commerce in serving individual needs reliably and efficiently, without unnecessary use of scarce resources.

> Part 3 – Economy of Knowledge in Decision Making
