But these systems can be tuned to reflect and address key concerns.
The concept of separation of concerns in technology design offers a path to better health policy. What follows is a list of ten separable concerns and responsive design strategies. Because the concerns interact only weakly, any of them can be omitted from a design in order to prioritize more important outcomes. Together, all of them can maximize scientific benefit while enhancing social trust.
An inspector should be assured that a vaccine certificate was not tampered with and that it was issued to the presenter. This need not imply any privacy risk, nor even require a network connection. One such method for authenticating vaccine credentials adds a human-recognizable and machine-readable face photo to a standard 2D barcode. It works with paper as well as mobile phone presentations.
The digital divide
For this concern, paper credentials have equity and privacy advantages. Equity, because paper is cheap and well understood. Privacy, because there is no expectation that a person must unlock and show a mobile phone. Digitally signed certificates that also include a photo, like #1 above, can be copied for convenience without risk of fraud.
Thank you, ONC, for the opportunity you gave me to speak in June. Also, thank you for the format of your August meeting, where the Zoom chat feature offered a wonderful venue for inclusive commentary and discussion as the talks were happening. Beats lining up at the microphone any day.
Here is a brief recap of my suggestions, in no particular order:
Until scientists discover a vaccine or treatment for COVID-19, our economy and our privacy will be at the mercy of imperfect technology used to manage the pandemic response.
Contact tracing, symptom capture, and immunity assessment are essential pandemic-response tools that can benefit from appropriate technology. However, the effectiveness of these tools is constrained by the privacy concerns inherent in mass surveillance. Lack of trust diminishes voluntary participation. Coerced surveillance can lead to hiding and to the injection of false information.
But it’s not a zero-sum game. The introduction of local community organizations as trusted intermediaries can improve participation, promote trust, and reduce the privacy impact of health and social surveillance.
Balancing Surveillance with Privacy
Privacy technology can complement surveillance technology when it drives adoption through trust borne of transparency and meaningful choice.
As the U.S. reckons with centuries of structural racism, an important step toward making health care more equitable will require transferring control of health records to patients and patient groups.
The Black Lives Matter movement calls upon us to review racism in all aspects of social policy, from law enforcement to health. Statistics show that Black Americans are at higher risk of dying from COVID-19. The reasons for these disparities are not entirely clear. Every obstacle to data collection makes it that much harder to find a rational solution, thereby increasing the death toll.
In the case of medical research and health records, we need reform that strips control away from hospital chains and corporations. As long as these entities control health records, they may put up barriers to hide unethical behavior or injustice. Transferring power and control into the hands of patients and patient groups would enable outside auditing of health practices, a necessary step to uncover whether these databases are fostering structural racism and other kinds of harm. This is the only way to enable transparency, audits, accountability, and ultimately justice.
A recent review in STAT indicates that Black Americans suffer three to six times as much morbidity due to COVID-19. These ratios are staggering, and the search for explanations has not yielded satisfying answers.
How should we react to 1,718 pages of new regulation? Let’s start by stipulating the White House and HHS perspective:
“Taken together, these reforms will deliver on the promise to put patients at the center of their own health care — you are empowered with control over your own health care choices.”
Next, let’s stipulate the patient perspective via this video lovingly assembled by e-Patient Dave, Morgan Gleason, and the folks at the Society for Participatory Medicine. In less than 3 minutes, there are 15 patient stories, each with a slightly different take on success.
This piece is part of the series “The Health Data Goldilocks Dilemma: Sharing? Privacy? Both?” which explores whether it’s possible to advance interoperability while maintaining privacy. Check out other pieces in the series here.
Alice makes an appointment in the breast cancer practice using the Mayo patient portal. Mayo asks permission to access her health records. Alice is offered two choices: one uses HIPAA without her consent, the other is under her control. She can either:
Enter her demographics and insurance info and have The Platform use HIPAA surveillance to gather her records wherever Mayo can find them, or
Copy her Mayo Clinic ID and enter it into the patient portal of any hospital, lab, or payer to request that her records be sent directly to Mayo.
Alice feels vulnerable. What other information will The Platform gather using their HIPAA surveillance power? She recalls a 2020 law that expanded HIPAA to allow access to her behavioral health records at Austin Rehab.
Alice prefers to avoid HIPAA surprises and picks the patient-directed choice. She enters her Mayo Clinic ID into Ascension’s patient portal. Unfortunately, Ascension is using the CARIN Alliance code of conduct and best practices. Ascension tells Alice that they will not honor her request to send records directly to Mayo. Ascension tells Alice that she must use the Apple Health platform or some other intermediary app to get her records if she wants control.
Google’s semi-secret deal with Ascension is testing the limits of HIPAA as society grapples with the future impact of machine learning and artificial intelligence.
Glenn Cohen points out that HIPAA may not be keeping up with how patients and society consent to the ways personal data is used. Is prior consent, particularly consent from vulnerable patients seeking care, a good way to regulate secret commercial deals with their caregivers? The answer to a question is strongly influenced by how you ask it.
Here’s a short review of this and related scandals. It also links to a recent deal between Mayo and Google, also semi-secret. A scholarly investigative journalism report of the 2016 Google AI scandal with a London NHS Foundation Trust might be summarized as: the core issue is not consent; it is a conflict of interest at the very foundation of the information governance process. The foxes are guarding the patient data henhouse. When the secrecy of a deal is broken, a scandal ensues.
The parts of the Google-Ascension deal that are secret are likely designed to misdirect attention away from the intellectual property value of the business relationship.
The Oct. 22 announcement starts with: “U.S. Sens. Mark R. Warner (D-VA), Josh Hawley (R-MO) and Richard Blumenthal (D-CT) will introduce the Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act, bipartisan legislation that will encourage market-based competition to dominant social media platforms by requiring the largest companies to make user data portable – and their services interoperable – with other platforms, and to allow users to designate a trusted third-party service to manage their privacy and account settings, if they so choose.”
Although the scope of this bill is limited to the largest of the data brokers (messaging, multimedia sharing, and social networking) that currently mediate between us as individuals, it contains groundbreaking provisions for delegation by users that are a road map to privacy regulations in general for the 21st century.
The bill’s Section 5: Delegation describes a new right for us as data subjects at the mercy of the institutions we are effectively forced to use. This is the right to choose and delegate authority to a third-party agent that can manage interactions with the institutions on our behalf. The third-party agent can be anyone we choose subject to their registration with the Federal Trade Commission. This right to digital representation by an entity of our choice with access to the full range of our direct control capabilities is unprecedented, as far as I know.
US healthcare is exceptional among rich economies. Exceptional in cost. Exceptional in disparities. Exceptional in the political power hospitals and other incumbents have amassed over decades of runaway healthcare exceptionalism.
The latest front in healthcare exceptionalism is over who profits from patient records. Parallel articles in the NYTimes and THCB frame the issue as “barbarians at the gate” when the real issue is an obsolete health IT infrastructure and how ill-suited it is for the coming age of BigData and machine learning. Just check out the breathless announcement of “frictionless exchange” by Microsoft, AWS, Google, IBM, Salesforce and Oracle. Facebook already offers frictionless exchange. Frictionless exchange has come to mean that one data broker, like Facebook, adds value by aggregating personal data from many sources and then uses machine learning to find a customer, like Cambridge Analytica, that will use the predictive model to manipulate your behavior. How will the six data brokers in the announcement be different from Facebook?
The NYTimes article and the THCB post imply that we will know the barbarians when we see them and then rush to talk about the solutions. Aside from calls for new laws in Washington (weaken behavioral health privacy protections, preempt state privacy laws, reduce surprise medical bills, allow a national patient ID, treat data brokers as HIPAA covered entities, and maybe more) our leaders have to work with regulations (OCR, information blocking, etc…), standards (FHIR, OAuth, UMA), and best practices (Argonaut, SMART, CARIN Alliance, Patient Privacy Rights, etc…). I’m not going to discuss new laws in this post and will focus on practices under existing law.
Patient-directed access to health data is the future. This was made clear at the recent ONC Interoperability Forum as opened by Don Rucker and closed with a panel about the future. CARIN Alliance and Patient Privacy Rights are working to define patient-directed access in what might or might not be different ways. CARIN and PPR have no obvious differences when it comes to the data models and semantics associated with a patient-directed interface (API). PPR appreciates HL7 and CARIN efforts on the data models and semantics for both clinics and payers.
The rather esoteric issue of a national patient identifier has come to light as a difference between two major health care bills making their way through the House and the Senate.
The bills are linked to outrage over surprise medical bills but they have major implications over how the underlying health care costs will be controlled through competitive insurance and regulatory price-setting schemes. This Brookings comment to the Senate HELP Committee bill summarizes some of the issues.
Those in favor of a national patient identifier are mostly hospitals and data brokers, along with their suppliers. More support is discussed here. The opposition comes mostly from privacy and libertarian perspectives. A more general opposition discussion of the Senate bill is here.
Although obscure, national patient identifier standards can help clarify the role of government in the debate over how to reduce the exceptional costs and disparities of the U.S. health care system. What follows is a brief analysis of the complexities of patient identifiers and their role relative to health records and health policy.