RFID Tags for Nurses. Then Everybody?


The recent City of Ontario v. Quon decision has had a mixed reception among privacy advocates. Though many are disappointed that employees’ privacy rights have once again been narrowed, some have discerned helpful dicta in the case. However, I worry that, whatever the drift of thought among swing justices, economic imperatives and cultural shifts will mean a lot less privacy in the workplace of the future. Health care in particular offers a few interesting bellwethers.

As an opinion piece by Theresa Brown explains, maintaining proper staffing levels in hospitals is becoming increasingly difficult. Surveillance systems are offering one way to address the problem; work can be performed more intensively and efficiently as it is recorded and studied. But such monitoring has many troubling implications, according to Torin Monahan (in his excellent book, Surveillance in a Time of Insecurity):

The tracking of people [via Radio Frequency Identification Tags] represents a . . . mechanism of surveillance and social control in hospital settings. This includes the tagging of patients and hospital staff. . . . When administrators demand the tagging of nurses themselves, the level of surveillance can become oppressive. . . . [because nurses face] labor intensification, job insecurity, undesired scrutiny, and privacy loss. . . . To date, such efforts at top-down micromanagement of staff by means of RFID have met with resistance. . . . One desired feature for nurses and others is an ‘off’ switch on each RFID badge so that they can take breaks without subjecting themselves to remote tracking. (122)

Like the “nannycam” employed by many a wary parent, the nurse-cam may be seen as a way to protect the vulnerable. It may also improve the accuracy of evidence in malpractice cases. On the other hand, training a tireless electronic eye on what is already an extremely stressful job may create many unintended consequences, or deter people from going into nursing altogether. Even advocates of pervasive surveillance recognize these difficulties.

The increasing pressure to monitor what happens inside hospitals reminds me of a recent article by Thomas Goetz in Wired (no link yet) on Google co-founder Sergey Brin’s quest to find a cure for Parkinson’s disease. As Goetz describes it, a new form of “high-speed science” depends on rapid accumulation of as much data as possible:

In Brin’s way of thinking, each of our lives is a potential contribution to scientific insight. We all go about our days, making choices, eating things, taking medications, doing things—generating what is inelegantly called data exhaust. . . . With contemporary computing power, that data can be tracked and analyzed. “Any experience that we have or drug that we may take, all those things are individual pieces of information. Individually, they’re worthless, they’re anecdotal. But taken together they can be very powerful.” In computer science, the process of mining such large data sets for useful associations is known as a market-basket analysis.

Goetz has promoted this as a new way to “do science in the petabyte age.”
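To make the quoted idea concrete, here is a minimal sketch of a market-basket analysis over “data exhaust.” This is not Brin’s or Goetz’s actual method; the records, item names, and thresholds are invented for illustration. The technique simply counts how often items co-occur (support) and how often one item appears given another (confidence):

```python
from itertools import combinations
from collections import Counter

# Hypothetical "data exhaust": each record is one person's self-reported
# exposures and outcomes. All names here are invented for illustration.
records = [
    {"drug_a", "symptom_x"},
    {"drug_a", "symptom_x", "drug_b"},
    {"drug_b"},
    {"drug_a", "symptom_x"},
    {"drug_a"},
]

item_counts = Counter()
pair_counts = Counter()
for rec in records:
    item_counts.update(rec)                      # count single items
    pair_counts.update(combinations(sorted(rec), 2))  # count co-occurring pairs

# An association "a -> b" is interesting when the pair co-occurs often
# (support) and b usually appears when a does (confidence).
n = len(records)
rules = {
    (a, b): {"support": c / n, "confidence": c / item_counts[a]}
    for (a, b), c in pair_counts.items()
    if c / n >= 0.4  # arbitrary support threshold for this toy example
}
print(rules)
```

Individually each record is, as Brin says, anecdotal; only in aggregate does a pattern such as “drug_a tends to co-occur with symptom_x” emerge from the counts.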

I had a few responses to these ideas. First, I do support methods to make electronic health records, once properly anonymized, a foundation for good medical research. But we do need to recognize what Paul Ohm has demonstrated in his recent work: there is an inverse relationship between anonymization and utility for a broad range of data. To use just one example—there may not be that many 6′7″ individuals in a given zip code, but tagging records from such individuals with their height may be a key part of solving certain medical puzzles researchers are trying to crack. We can either “cleanse” the data of height information in order to help anonymize the tall (and thus frustrate some research), or we can leave it in and possibly compromise the anonymity of those tall individuals. Having recently spoken to an epigenetics researcher who aspired to track all aspects of the life of a certain group of individuals, I have a sense the latter path is going to be taken more often in the future.
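The trade-off Ohm describes can be seen in a toy example along the lines of the tall-individuals case. This is my own illustrative sketch, not Ohm’s analysis; the zip code, heights, and field names are invented. It measures re-identifiability the way the k-anonymity literature does: by the size of the smallest group of records sharing the same quasi-identifiers.

```python
from collections import Counter

# Toy records for one zip code; all fields and values are invented.
records = [
    {"zip": "07079", "height_in": 79, "dx": "tall_stature_workup"},
    {"zip": "07079", "height_in": 67, "dx": "none"},
    {"zip": "07079", "height_in": 67, "dx": "none"},
    {"zip": "07079", "height_in": 67, "dx": "none"},
]

def smallest_class(rows, quasi_ids):
    """Size of the smallest group sharing the same quasi-identifier values.
    A group of size 1 means that record can be singled out."""
    classes = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(classes.values())

# With exact height, the 6'7" (79 in.) patient sits in a class by themselves:
print(smallest_class(records, ["zip", "height_in"]))    # 1 -> re-identifiable

# "Cleansing" height into a coarse band restores anonymity...
for r in records:
    r["height_band"] = ">=66in" if r["height_in"] >= 66 else "<66in"
print(smallest_class(records, ["zip", "height_band"]))  # 4 -> blends into the group

# ...but the banded data no longer lets a researcher isolate the 79-in. record,
# which is precisely the utility the anonymization step destroyed.
```

The example makes the inverse relationship tangible: the same generalization that protects the tall individual erases the very attribute a height-related medical study would need.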

Second, on a cultural level, there is a gradual melding of surveillance programs with a) what Daniel Callahan calls the “research imperative” and b) the rhetoric of war. Nobel Laureate Joshua Lederberg expressed the research imperative in its purest form when he said, “The blood of those who will die if biomedical research is not pursued will be upon the hands of those who don’t do it.” Privacy advocates will need to find equally pithy and dramatic encapsulations of their values if the research imperative is not to run roughshod over extant privacy rights.

Related “war rhetoric” was thoughtfully aired at an Intelligence Squared debate on cyberwar that included Jonathan Zittrain and Bruce Schneier. Your attitude toward military access to internet communications depends a lot on whether you think a disruption of the network will result in mere inconvenience or, say, the collapse of the banking system. When the specter of death or war is invoked, it is difficult for advocates of privacy or workplace autonomy to promote values of comparable importance. However, they can at least try to clarify exactly what interests are motivating the promotion of certain programs of surveillance.

Frank Pasquale is the Schering-Plough Professor in health care regulation and enforcement at Seton Hall Law School and is the Associate Director of the Center for Health & Pharmaceutical Law & Policy. He has distinguished himself as an internationally recognized scholar in health, intellectual property, and information law and has made numerous academic presentations at universities across North America and at the National Academy of Sciences. A prolific writer, Professor Pasquale’s work has been featured in top law reviews, books, peer-reviewed journals, and online blogs, including Health Reform Watch, of which he is Editor-in-Chief. A frequent media presence, he has appeared in the New York Times, San Francisco Chronicle, Los Angeles Times, Boston Globe, Financial Times, and on CNN, WNYC’s Brian Lehrer Show, and National Public Radio’s Talk of the Nation.
