
Tag: open data

Why go to Health Datapalooza? Ask Bruce Greenstein, CTO of HHS

Health Datapalooza is coming up quickly at the end of April, so I sat down with Bruce Greenstein, CTO of HHS, about why all of THCB’s health tech friends should attend. Plus, we get into what’s happening with the open data movement and how Bruce’s past life at Microsoft will shape how he and HHS work with the consumer tech companies that are pushing harder and harder into healthcare.

Government’s New Doctor Payments Website Worthy of a Recall


If the federal government’s new Open Payments website were a consumer product, it would be returned to the manufacturer for a full refund.

Open Payments is the government’s site for publishing payments made to doctors and teaching hospitals by drug and medical device manufacturers. It includes 4.4 million payments, worth $3.5 billion, to more than half a million doctors and almost 1,360 teaching hospitals.

In a news release announcing the site’s launch, the Centers for Medicare and Medicaid Services said the goal is “to help consumers understand the financial relationships between the health care industry, and physicians and teaching hospitals.”

Continue reading…

OpenFDA – the Good, the Bad, and the Ugly


Adverse Event Reports Since 2004. Source: OpenFDA

On June 3rd, the FDA launched OpenFDA, in an attempt to take large internal datasets and make them more accessible and usable by the developer and business community.

OpenFDA is delivered as a search-based API that should enable software developers to more easily build applications on adverse event data from the FDA Adverse Event Reporting System (FAERS) for the period 1/1/2004 to 6/30/2013. The FDA has announced plans to add device and food adverse event data to the framework, along with structured product labeling and recall data (update: drug and device recall data were added on July 16).
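For developers, querying the adverse-event endpoint is a single HTTP GET. A minimal sketch in Python using only the standard library (the search expression is illustrative; consult the openFDA documentation for the full query syntax):

```python
import json
import urllib.parse
import urllib.request

BASE = "https://api.fda.gov/drug/event.json"

def build_query(search, limit=1):
    """Build an openFDA adverse-event query URL."""
    return BASE + "?" + urllib.parse.urlencode({"search": search, "limit": limit})

# Reports received between 1/1/2004 and 6/30/2013 (the dataset's initial window)
url = build_query("receivedate:[20040101 TO 20130630]")
print(url)

# Uncomment to run against the live API (requires network access):
# with urllib.request.urlopen(url) as resp:
#     print(json.load(resp)["meta"]["results"]["total"])
```

The response is JSON with a `meta` block (result counts) and a `results` array of individual reports, so the barrier to a first working prototype is genuinely low.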

The launch was heralded with the sort of buzz and hoopla usually reserved for a major product launch from a Silicon Valley startup. We have held off on any analysis and opinion until now to give our team the needed time to look through the system thoroughly.

Now, I readily admit, I am biased. As a data geek with 15+ years working on Big Data problems, I really want to love OpenFDA. It is, after all, a major step forward in technology and, more importantly, in philosophy from an agency that hasn’t exactly been a shining example of either in recent years.

In an ideal world, OpenFDA could usher in a world of new and improved tools and products that would improve patient safety and adherence, increase physician awareness of drug safety dangers, assist healthcare decision makers who are driving prescribing behavior with better decision support, and lower the overall cost of care by reducing avoidable side effects.

But we don’t live in an ideal world.

So, here are my thoughts on the Good, Bad, and Ugly of OpenFDA:

The Good:

OpenFDA is a seal of approval over the use of FAERS data in multiple settings. This is the first time the FDA has confirmed what we have believed all along – that these data are valuable and should be used in multiple venues to improve patient safety. It has long seemed ridiculous that the FDA spends millions of dollars to collect these data and uses them for its own internal safety signaling and review processes, but then deters others in the healthcare community from deploying the same data in new and innovative ways.

OpenFDA gives the long-awaited ‘all-clear’ to harness the power of these data to improve care throughout the healthcare system. We’re excited to see how this evolves in the product space – especially at the patient level – in the months and years ahead.

The Bad:

As with any new launch, once the excitement dies down, the true capabilities and limitations of the system are revealed. After careful review we’ve discovered several major concerns; two of the biggest are detailed below:

Continue reading…

Improving Clinical Document Exchange

SMART C-CDA infographic (click to enlarge)

2014 will see wide-scale production and exchange of Consolidated CDA documents among healthcare providers. Indeed, live production of C-CDAs is already underway for anyone using a Meaningful Use 2014 certified EHR.

C-CDA documents fuel several aspects of meaningful use, including transitions of care and patient-facing download and transmission.

This impending deluge of documents represents a huge potential for interoperability, but it also presents substantial technical challenges.

We forecast these challenges with unusual confidence because of what we learned during the SMART C-CDA Collaborative, an eight-month project conducted with 22 EHR and HIT vendors.

Our effort included analyzing vendor C-CDA documents, scoring them with a C-CDA scorecard tool we developed, and reviewing our results through customized one-on-one sessions with 11 of the vendors.

The problems we uncovered arose for a number of reasons, including:

  • material ambiguities in the C-CDA specification
  • accidental misinterpretations of the C-CDA specification
  • lack of authoritative “best practice” examples for C-CDA generation

Continue reading…
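Many of these failure modes are mechanically detectable. Here is a minimal sketch of the kind of automated check a scorecard tool can perform, verifying that a C-CDA carries a few expected LOINC-coded sections (the section list and sample document are illustrative, not the actual scorecard criteria):

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}

# LOINC codes for a few sections commonly exchanged in C-CDA documents
EXPECTED_SECTIONS = {
    "48765-2": "Allergies",
    "10160-0": "Medications",
    "11450-4": "Problem List",
}

def missing_sections(ccda_xml):
    """Return names of expected sections absent from a C-CDA document string."""
    root = ET.fromstring(ccda_xml)
    found = {
        code.get("code")
        for code in root.iterfind(".//hl7:section/hl7:code", NS)
    }
    return [name for loinc, name in EXPECTED_SECTIONS.items() if loinc not in found]

sample = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <component><structuredBody>
    <component><section>
      <code code="10160-0" codeSystem="2.16.840.1.113883.6.1"/>
      <title>Medications</title>
    </section></component>
  </structuredBody></component>
</ClinicalDocument>"""

# Medications is present; Allergies and Problem List are not
print(missing_sections(sample))  # ['Allergies', 'Problem List']
```

Checks like this catch only structural omissions; the harder problems, such as a semantically wrong code inside a well-formed section, still require the kind of one-on-one human review described above.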

The Pharmacies and Retailers Say They’re In. Is the Blue Button Initiative About to Change Everything?

The Obama administration announced significant adoption for the Blue Button in the private sector on Friday.

In a post on the White House Office of Science and Technology Policy blog, Nick Sinai, U.S. deputy chief technology officer, and Adam Dole, a Presidential Innovation Fellow at the U.S. Department of Health and Human Services, listed major pharmacies and retailers joining the Blue Button initiative, which enables people to download a personal health record in an open, machine-readable electronic format:

“These commitments from some of the Nation’s largest retail pharmacy chains and associations promise to provide a growing number of patients with easy and secure access to their own personal pharmacy prescription history and allow them to check their medication history for accuracy, access prescription lists from multiple doctors, and securely share this information with their healthcare providers,” they wrote.

“As companies move towards standard formats and the ability to securely transmit this information electronically, Americans will be able to use their pharmacy records with new innovative software applications and services that can improve medication adherence, reduce dosing errors, prevent adverse drug interactions, and save lives.”
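Once pharmacy records are available as a machine-readable download, even simple tooling can put them to work. A sketch of pulling a medication list out of a Blue Button-style ASCII export (the file layout below is an illustrative assumption, not the official format):

```python
import re

SAMPLE = """MY MEDICATION LIST
--------------------
Medication: Lisinopril 10mg  Refills: 2
Medication: Metformin 500mg  Refills: 5
--------------------
MY ALLERGIES
--------------------
Penicillin
"""

def medications(text):
    """Extract medication names (first token after 'Medication:') from the export."""
    return re.findall(r"^Medication:\s*(\S+)", text, flags=re.MULTILINE)

print(medications(SAMPLE))  # ['Lisinopril', 'Metformin']
```

An adherence or interaction-checking app would start from exactly this kind of extraction, which is why the move toward standard formats matters so much.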

While I referred to the Blue Button obliquely at ReadWrite almost two years ago and in many other stories, I can’t help but wish that I’d finished my feature for Radar a year ago and written up a full analytical report.

Extending access to a downloadable personal health record to millions of Americans has been an important, steady shift that has largely gone unappreciated, despite reporting like Ina Fried’s regarding veterans getting downloadable health information.

According to the Office of the National Coordinator for Health IT, “more than 5.4 million veterans have now downloaded their Blue Button data and more than 500 companies and organizations in the private-sector have pledged to support it.”

Continue reading…

A Case for Open Data

A couple of weeks ago, President Obama launched a new open data policy (pdf) for the federal government. Declaring that, “…information is a valuable asset that is multiplied when it is shared,” the Administration’s new policy empowers federal agencies to promote an environment in which shareable data are maximally and responsibly accessible. The policy supports broad access to government data in order to promote entrepreneurship, innovation, and scientific discovery.

If the White House needed an example of the power of data sharing, it could point to the Psychiatric Genomics Consortium (PGC). The PGC began in 2007 and now boasts 123,000 samples from people with a diagnosis of schizophrenia, bipolar disorder, ADHD, or autism, plus 80,000 controls, collected by over 300 scientists from 80 institutions in 20 countries. This consortium is the largest collaboration in the history of psychiatry.

More important than the size of this mega-consortium is its success. There are perhaps three million common variants in the human genome. Amidst so much variation, it takes a large sample to find a statistically significant genetic signal associated with disease. Showing a kind of “selfish altruism,” scientists began to realize that by pooling data, combining computing efforts, and sharing ideas, they could detect the signals that had been obscured because of lack of statistical power. In 2011, with 9,000 cases, the PGC was able to identify 5 genetic variants associated with schizophrenia. In 2012, with 14,000 cases, they discovered 22 significant genetic variants. Today, with over 30,000 cases, over 100 genetic variants are significant. None of these alone are likely to be genetic causes for schizophrenia, but they define the architecture of risk and collectively could be useful for identifying the biological pathways that contribute to the illness.
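The power argument is easy to make concrete. A back-of-the-envelope sample-size calculation for a two-proportion z-test at genome-wide significance (alpha = 5e-8, the conventional threshold after correcting for roughly a million independent tests; the allele frequencies and effect size here are illustrative assumptions, not PGC figures):

```python
import math
from statistics import NormalDist

def cases_needed(p_ctrl, p_case, alpha=5e-8, power=0.8):
    """Cases (with an equal number of controls) needed to detect an
    allele-frequency difference via a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~5.45 at genome-wide alpha
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p_ctrl + p_case) / 2
    n = ((z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)) / (p_ctrl - p_case) ** 2
    return math.ceil(n)

# A variant at 20% frequency in controls vs. 22% in cases: a modest effect
# that demands a sample on the order of tens of thousands of cases
print(cases_needed(0.20, 0.22))
```

Numbers like these show why the 9,000-case analyses of 2011 found only a handful of variants while today's 30,000-plus cases find over 100: the signals were always there, just below the detection threshold.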

We are seeing a similar culture change in neuroimaging. The Human Connectome Project is scanning 1,200 healthy volunteers with state-of-the-art technology to define variation in the brain’s wiring. The imaging data, cognitive data, and de-identified demographic data on each volunteer are available, along with a workbench of web-based analytical tools, so that qualified researchers can obtain access and interrogate one of the largest imaging data sets anywhere. How exciting to think that a curious scientist with a good question can now explore a treasure trove of human brain imaging data—and possibly uncover an important aspect of brain organization—without ever doing a scan.

Continue reading…

Health Datapalooza Day One: How Will We Grow Data for Improving Health?

An unfathomably complex entity such as a health system grows over time like a city. Right now, communications and data usage in the US healthcare system resemble a medieval town, with new streets and squares popping up in unpredictable places and no clear paths between them. Growth in health information has accelerated tremendously over the past few years with the popularity of big data generally, and we are still erecting structures wherever seems convenient, without building codes.

In some cities, as growth reaches the breaking point, commissioners step in. Neighborhoods are razed, conduits are laid in the ground for electricity and plumbing, and magnificent new palaces take the place of the old slums. But our health information system lacks its Baron Haussmann. The only force that could seize that role–the Office of the National Coordinator–has been slow to impose order, even as it funds the creation of open standards. Today, however, we celebrate growth and imagine a future of ordered data.

The health data forum that started today (Health Datapalooza IV) celebrated all the achievements across government and industry in creating, using, and sharing health data.

Useful data, but not always usable

I came here asking two essential questions of people I met: “What data sources do you find most useful now?” and “What data is missing that you wish you had?” The answer to the first can be found at a wonderful Health Data All-Stars site maintained by the Health Data Consortium, which is running the palooza.

The choices on this site include a lot of data from the Department of Health and Human Services, also available on their ground-breaking HealthData.gov site, but also a number of data sets from other places. The advantage of the All-Stars site is that it features just a few (fifty) sites that got high marks in a survey conducted among a wide range of data users, including government agencies, research facilities, and health care advocates.

Continue reading…

Universal EHR? No. Universal Data Access? Yes.

A recent blog posting calls for a “universal EMR” for the entire healthcare system. The author provides an example and correctly laments how lack of access to the complete data about a patient impedes optimal clinical care. I would add that quality improvement, clinical research, and public health are impeded by this situation as well.

However, I do not agree that a “universal EMR” is the best way to solve this problem. Instead, I would advocate that we need universal access to underlying clinical data, from which many different types of electronic health records (EHRs), personal health records (PHRs), and other applications can emerge.

What we really need for optimal use of health information is not an application but a platform. This notion has been advanced by many, perhaps most eloquently by Drs. Kenneth Mandl and Isaac Kohane of Boston Children’s Hospital [1,2]. Their work is being manifested in the SMART platform that is being funded by an ONC SHARP Award.

Continue reading…

Open Data Advocate Joins Patient Privacy Rights Group as Chief Technology Officer

The small news is that I formally joined Patient Privacy Rights as chief technology officer. I have been an extreme advocate for open data for years. For example, I’m a card-carrying member of the Personal Genome Project, where I volunteer to post both my genome and most of my medical record. PPR, on the other hand, is well known for publicizing the harms of personal data releases. These two seemingly contradictory perspectives represent the matter-antimatter pair that can power the long march to health reform.

The value of personal medical data is what drives the world of healthcare and is the key to health reform. The World Economic Forum says: “Personal data is becoming a new economic ‘asset class’, a valuable resource for the 21st century that will touch all aspects of society.” This “asset” is sought and cherished by institutions of all sorts. Massive health care organizations, research universities, pharmaceutical companies, and both state and federal regulators are eager to accumulate as much personal medical data as they can get and to invest their asset for maximum financial return. Are patient privacy rights just sand in the gears of progress?

Continue reading…

ONC Holds A Key To the Structural Deficit

It’s called Blue Button+ and it works by giving physicians and patients the power to drive change.

The US deficit is driven primarily by healthcare pricing and unwarranted care. Social Security and Medicare cuts contemplated by the Obama administration will hurt the most vulnerable while doing little to address the fundamental issue of excessive institutional pricing and utilization leverage. Bending the cost curve requires both changing physicians’ incentives and providing them with the tools. This post is about technology that can actually bend the cost curve by letting the doctor refer, and the patient seek care, anywhere.

The bedrock of institutional pricing leverage is institutional control of information technology. Our lack of price and quality transparency and the frustrating lack of interoperability are not an accident. They are the carefully engineered result of a bargain between the highly consolidated electronic health records (EHR) industry and their powerful institutional customers that control regional pricing. Pricing leverage comes from vendor and institutional lock-in. Region by region, decades of institutional consolidation, tax-advantaged, employer-paid insurance and political sophistication have made the costliest providers the most powerful.

Continue reading…
