How Is Data Changing Healthcare?

Dipti Patel-Misra
Chief Data and Analytics Officer, MedAmerica Data Services

Joshua Tamayo-Sarver, MD, PhD
Vice President of Informatics

Published December 04, 2018

Insights on the National Institutes of Health data science strategy

In June 2018, the National Institutes of Health (NIH) unveiled its new data science strategic plan, with the goal of improving the availability of biomedical data to researchers, innovators, and clinicians across the country.

How will this initiative impact patients and providers? Vituity's Chief Data and Analytics Officer Dipti Patel-Misra, PhD, MBA, and Vice President of Informatics Josh Tamayo-Sarver, MD, PhD, talk about how the plan could transform patient care and the challenges it will face.

To start us off, please tell us briefly what this new NIH strategic plan is all about.

Dipti Patel-Misra: Over the past few decades, we've seen an explosion in the amount of biomedical data created. To take just one example, the amount of genomic sequencing data has grown exponentially since the 1990s. Not only do we need to create systems that can house that data, we also need to make it accessible to researchers and stakeholders.

Ideally, digital research libraries should be organized according to the FAIR Data Principles, meaning information is findable, accessible, interoperable, and reusable. In Europe, funded researchers are incentivized to manage their data according to FAIR. But here in the U.S., we've never had national data standards.

This explains why researchers at different U.S. universities might study the same question but index their results in completely different ways. And the computer systems that house their data might not be connected to one another or to a central network. As a result, a lot of valuable data that could help patients sits unused.

The NIH strategic plan aims to create national standards and systems that promote sharing and reuse of biomedical data. This includes integrating existing data management tools, creating new tools, developing universal search algorithms, and setting metadata and indexing standards.
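To make the FAIR idea concrete, here is a minimal sketch of what a FAIR-oriented metadata record for a dataset might look like. The field names, codes, and URLs are hypothetical rather than drawn from any actual NIH or FAIR specification; the point is simply that each principle corresponds to concrete metadata a repository can index and enforce.

```python
# A hypothetical FAIR-style metadata record for a genomic dataset.
# Field names and values are illustrative, not from an actual NIH standard.
fair_record = {
    # Findable: a persistent identifier plus rich, indexed metadata
    "identifier": "doi:10.0000/example-genome-study",
    "title": "Example whole-genome sequencing study",
    "keywords": ["genomics", "sequencing", "cohort study"],

    # Accessible: how the data can be retrieved, including access conditions
    "access_url": "https://repository.example.org/datasets/12345",
    "access_protocol": "HTTPS",
    "access_rights": "controlled; data use agreement required",

    # Interoperable: standard formats and shared vocabularies
    "format": "FASTQ",
    "vocabulary": "standard phenotype terminology for annotations",

    # Reusable: license and provenance so others can build on the data
    "license": "CC-BY-4.0",
    "provenance": "Collected 2015-2017; processed with pipeline v2.1",
}

def is_findable(record: dict) -> bool:
    """Toy check: a record is 'findable' if it has an identifier and keywords."""
    return bool(record.get("identifier")) and bool(record.get("keywords"))

print(is_findable(fair_record))  # True
```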

How could having a robust national data system transform patient care?

DPM: I believe it would allow medical science to break through new barriers. We could move into the realm of personalized medicine, where we might prescribe a different diet or medication based on a person's genetic makeup.

We could also create more effective clinical pathways. For example, most people who present to the ED with chest pain don't need to be hospitalized. But how does a physician make that call when the stakes are so high? Decision-support tools could help them decide when to discharge the patient and when to order more studies.
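As a toy illustration of what such a decision-support tool might look like under the hood, the sketch below encodes a simple rule-based risk score for a chest pain patient. The factors, weights, and thresholds are invented for illustration and are not clinical guidance; real tools are derived from and validated against large datasets, which is exactly where broader data sharing would help.

```python
# Toy rule-based risk score for a chest pain patient.
# Factors, weights, and thresholds are illustrative only -- not clinical guidance.

def chest_pain_risk_score(age: int, abnormal_ecg: bool,
                          elevated_troponin: bool, risk_factor_count: int) -> int:
    """Return a simple additive risk score from a few hypothetical inputs."""
    score = 0
    if age >= 65:
        score += 2
    elif age >= 45:
        score += 1
    if abnormal_ecg:
        score += 2
    if elevated_troponin:
        score += 2
    score += min(risk_factor_count, 2)  # cap the contribution of comorbidities
    return score

def suggested_disposition(score: int) -> str:
    """Map the toy score to a suggested next step for the clinician to review."""
    if score <= 2:
        return "Consider discharge with outpatient follow-up"
    if score <= 5:
        return "Consider observation and further studies"
    return "Consider admission"

score = chest_pain_risk_score(age=52, abnormal_ecg=False,
                              elevated_troponin=False, risk_factor_count=1)
print(score, suggested_disposition(score))
```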

JTS: While clinical science may advance, I'm not sure care delivery will look all that different — at least in the short term.

There's a lot of talk about how artificial intelligence will change clinical decision-making. The problem is, providers can be quite variable in their approaches. And in many cases, medical science hasn't advanced to a point where we can conclusively recommend one decision over another. So to support decision-making, we would need a system that embraces a huge diversity of practice.

There's also a dearth of data to inform many decisions. People talk about big data in healthcare, but we're not quite there yet. To put it in perspective, the black box on a plane records more data on a cross-country flight than all of the healthcare data in the country.

In your opinion, what challenges will the NIH face in executing its plan?

DPM: Two issues come to mind. One, a lot of healthcare information is actually more language than structured data. To convert it to usable data, you need to place it within a medical taxonomy or quantify it in some way. Natural language processing has come a long way in recent years, but healthcare remains a huge challenge, because the language is so idiosyncratic.
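To show what "placing language within a medical taxonomy" can mean in practice, here is a deliberately simple sketch that maps phrases from a free-text note to codes in a made-up taxonomy using keyword matching. Real clinical NLP relies on far more sophisticated models and standard terminologies such as SNOMED CT or ICD-10; the codes and keyword lists below are hypothetical.

```python
# Toy example: turn a free-text clinical note into structured codes.
# The taxonomy codes and keyword lists are hypothetical placeholders,
# not real SNOMED CT or ICD-10 entries.

TOY_TAXONOMY = {
    "CP001": {"label": "chest pain", "keywords": ["chest pain", "chest tightness"]},
    "SB002": {"label": "shortness of breath", "keywords": ["shortness of breath", "dyspnea"]},
    "HTN03": {"label": "hypertension", "keywords": ["hypertension", "high blood pressure"]},
}

def code_note(note_text: str) -> list[str]:
    """Return taxonomy codes whose keywords appear in the note (naive matching)."""
    text = note_text.lower()
    return [code for code, entry in TOY_TAXONOMY.items()
            if any(keyword in text for keyword in entry["keywords"])]

note = ("58 y/o male presents with chest tightness and shortness of breath. "
        "History of high blood pressure.")
print(code_note(note))  # ['CP001', 'SB002', 'HTN03']
```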

The second challenge is managing the vast amount of information created in real time. In recent years, we've seen a proliferation of smart healthcare devices. Patients already communicate with providers using smartphone apps and internet-enabled scales, blood pressure cuffs, and glucose meters. In the near future, they may be wearing smart contact lenses, skin patches, and implantable devices. And physicians may be using a virtual "assistant" like Alexa or Siri to deliver care at the bedside.

These devices provide valuable clinical data (which is de-identified to protect patient privacy). But they also produce a lot of noise. So the question becomes, how do we isolate and store the useful information in an efficient way? This is a question that healthcare is just starting to explore, and I'm sure it will challenge NIH in coming years.
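One very simplified way to think about isolating the useful information is to filter a device's raw stream down to the readings that are plausible and potentially actionable before storing them. The reading format and thresholds below are hypothetical; a real pipeline would also handle device error codes, trends over time, and data quality checks.

```python
# Toy filter for a stream of home blood pressure readings.
# Thresholds are illustrative, not clinical guidance.

readings = [
    {"systolic": 118, "diastolic": 76},
    {"systolic": 0,   "diastolic": 0},    # device glitch / noise
    {"systolic": 162, "diastolic": 101},  # potentially actionable
    {"systolic": 121, "diastolic": 79},
]

def is_valid(r: dict) -> bool:
    """Drop physiologically implausible values (likely sensor noise)."""
    return 50 <= r["systolic"] <= 260 and 30 <= r["diastolic"] <= 180

def is_actionable(r: dict) -> bool:
    """Flag readings a clinician might want to review (illustrative cutoff)."""
    return r["systolic"] >= 160 or r["diastolic"] >= 100

clean = [r for r in readings if is_valid(r)]
flagged = [r for r in clean if is_actionable(r)]
print(f"{len(clean)} valid readings, {len(flagged)} flagged for review")
```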

JTS: I think integrating data systems on a national level could get tricky. Everybody talks about interoperability, but I don't think there's much appreciation of the complexity involved.

Basically, interoperability is the ability of a computer or system to understand a message from another system. Interoperable systems not only speak the same language, they also share a common vocabulary.

A good example of interoperability in action is the finance industry. Banks all over the country can easily transact with one another. So why is it so much harder in healthcare? Two reasons come to mind.

First, in financial transactions, we have a unifying concept: currency. By contrast, biomedical data systems exchange DNA sequences, imaging videos, lab values, and many other things that can't be easily commoditized or quantified.

Second, the language of healthcare has a huge vocabulary. The average person knows about 45,000 words. By contrast, the average medical data system uses about a million terms.

Currently, biomedical data systems get around this problem with additional programming. You can add custom logic that lets two systems "translate" one another's messages. That lets them share data, but it's not true interoperability.
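Here is a minimal sketch of the kind of "translation" layer described above: a hand-built mapping that converts one system's local lab codes into another system's codes. Every code value is made up. The point is that such a mapping has to be written and maintained for every pair of systems and every term, which is why this approach does not scale to true interoperability.

```python
# Hypothetical point-to-point "translation" between two hospital systems.
# Every code below is made up; real systems would have thousands of entries.

SYSTEM_A_TO_SYSTEM_B = {
    "GLU-SER": "LAB1001",   # serum glucose
    "HGB-BLD": "LAB1002",   # hemoglobin
    "TROP-I":  "LAB1003",   # troponin I
}

def translate_message(system_a_message: dict) -> dict:
    """Rewrite a System A lab result so System B can ingest it."""
    local_code = system_a_message["code"]
    mapped = SYSTEM_A_TO_SYSTEM_B.get(local_code)
    if mapped is None:
        # Unmapped terms need a human to extend the mapping -- the scaling problem.
        raise KeyError(f"No System B code for {local_code!r}")
    return {"code": mapped, "value": system_a_message["value"],
            "units": system_a_message["units"]}

print(translate_message({"code": "GLU-SER", "value": 94, "units": "mg/dL"}))
```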

How would modernizing the data system affect Vituity's ability to care for patients?

DPM: For one, having access to large, standardized, and well-managed data sets would allow us to create more powerful tools for our clients and patients. Vituity has collected quite a bit of historical data from our practices over the past few decades. Our in-house data team can use it to predict things like peak ED volumes and optimal staffing levels. But with access to data from practices around the country, we could do much more.
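As a rough illustration of the kind of prediction described here, the sketch below estimates expected emergency department arrivals for each hour of the day by averaging historical counts and converting that into a naive staffing estimate. All numbers are invented, and the staffing rule is a placeholder; actual models would be considerably more sophisticated. The sketch only shows how historical data can feed an operational forecast.

```python
# Toy forecast of ED arrivals by hour, using the average of past days.
# All counts and the staffing rule are invented for illustration.
from collections import defaultdict

# (hour_of_day, arrivals) pairs pooled from several historical days
historical = [(9, 6), (9, 8), (9, 7), (18, 14), (18, 16), (18, 15), (2, 3), (2, 2)]

totals = defaultdict(list)
for hour, count in historical:
    totals[hour].append(count)

expected = {hour: sum(counts) / len(counts) for hour, counts in totals.items()}

# Simple staffing rule of thumb: one additional provider per 10 expected arrivals.
for hour in sorted(expected):
    providers = max(1, round(expected[hour] / 10))
    print(f"Hour {hour:02d}: expect {expected[hour]:.1f} arrivals -> {providers} provider(s)")
```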

I also believe that Vituity's data is valuable to other organizations, especially when it comes to understanding care quality, processes, and physician behavior. By partnering with NIH's data initiative, we could help to transform care for patients everywhere.

Thanks so much to both of you for sharing. Any final thoughts?

DPM: Personally, I'm excited to see NIH taking this initiative, because I believe it could have a huge impact on patient care. As I said earlier, it's quite an ambitious plan. I'm curious to see how they will break it down into actionable steps and execute over the next couple of years.

To learn more about Vituity's data programs and tools, visit our data services page.

