Achieving Data Liquidity . . . How GE is Advancing the State of the Art

Keith Boone

Co-author: Mark Segal

Big Data and Analytics are clearly the next wave of innovation in the technology sector, and this shift is especially important for GE, as illustrated in our work on the Industrial Internet.  Data liquidity is the essential ingredient for these powerful tools to realize their potential.  Specifically, data captured by healthcare IT systems needs to be made readily available to an array of other services in your digital infrastructure (which may extend well beyond your own formal organization) to provide analytics, decision support, and the kinds of early intervention capabilities that can detect changes and address them in real time.

One of the challenges to achieving data liquidity within an organization’s infrastructure is the diversity of data formats and associated technologies needed to interconnect within and across enterprises.  Sometimes, of course, this diversity is due to the essential nature of the data.  After all, individual lab test results, waveform data such as that coming from a fetal heart monitor or an EKG, and images from a variety of modalities have essential differences that must be accounted for.  At other times, the need for diversity is not so clear, as in the case of a single laboratory result that may be sent from a lab, imported into an EHR, expressed in a CCDA document shared with an HIE, or referenced by a quality measure.  Each of these use cases has unique demands, but the essential core of that lab test datum is the same across all the systems that have to use it.
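
To make this concrete, the sketch below shows one potassium result expressed two ways: as an HL7 v2 OBX segment, as it might arrive from a lab, and as a simplified CCDA result observation an EHR might place in a shared document.  The code, value, units, and reference range are hypothetical sample data, and the fragments are abbreviated; the point is that the essential datum is identical, and only the packaging differs.

# Illustrative sketch only: one potassium result, two wire formats.
# All values (LOINC 2823-3, 4.1 mmol/L, reference range) are sample data.

# 1. As an HL7 v2 OBX segment from a laboratory system:
hl7_v2_obx = "OBX|1|NM|2823-3^Potassium^LN||4.1|mmol/L|3.5-5.1|N|||F"

# 2. As a simplified CCDA result observation an EHR might share with an HIE:
ccda_observation = """
<observation classCode="OBS" moodCode="EVN">
  <code code="2823-3" codeSystem="2.16.840.1.113883.6.1" displayName="Potassium"/>
  <statusCode code="completed"/>
  <value xsi:type="PQ" value="4.1" unit="mmol/L"/>
</observation>
"""

# The essential core -- LOINC code 2823-3, value 4.1, unit mmol/L -- is the
# same in both encodings; each format adds its own packaging around it.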

At GE, we have been looking at ways to simplify connectivity within (and across) the healthcare enterprise, improving data access, lowering integration costs, and improving care for patients.  This simplification is essential to support the analytic capabilities demanded by high-performing accountable care organizations, truly integrated care, and the full realization of longitudinal electronic health records.

The reality is that the longitudinal electronic health record is not a feature of any single product from any single vendor.  Rather, it is an emergent property of multiple interconnected Health IT systems, supported by healthcare providers, payers, public health agencies, and third-party organizations supporting Health Information Exchange.  Making these systems connect so that we can apply analytics to the information available is a big challenge.

Recently, the Office of the National Coordinator for Health Information Technology (ONC) commissioned a report from JASON, an advisory body managed by The MITRE Corporation.  One key recommendation in that report addresses a similar concern:

EHR software vendors should be required to develop and publish APIs for medical records data, search and indexing, semantic harmonization and vocabulary translation, and user interface applications. In addition, they should be required to demonstrate that data from their EHRs can be exchanged through the use of these APIs and used in a meaningful way by third-party software developers.
–JASON: A Robust Health Data Infrastructure

At GE Healthcare, we would go a step further.  It won’t be enough for every vendor simply to publish its own APIs without reference to common core standards.  Ultimately, having hundreds of non-standard APIs is not much better than having hundreds of non-interoperable Health IT systems.  We feel it would be better to agree on a common core API for use cases that would benefit from common cross-vendor APIs, as technology developers like Microsoft, Netscape, IBM, and Sun did in the late 1990s on the Document Object Model (DOM).  The DOM specification is now supported by every major web browser in use.

We would argue that another important contrast with the JASON approach is that such core APIs should not be developed or required by the federal government, directly or indirectly.  Rather, they should emerge, and be tested and refined, through the robust standards development processes used by the healthcare industry.  APIs themselves are increasingly important to Health IT developers, including APIs designed for connection with other products and services; these must be designed and implemented to ensure security and patient privacy.  It will be important that, to ensure high performance, application developers are able to develop standards-based extensions to these core APIs as needed; after all, that is how the Document Object Model advanced from its initial version to its current third version.  Work on such standard cross-vendor APIs is progressing and is of great interest to GE.

For example, at GE Healthcare, we are investigating how emerging standards like HL7’s Fast Healthcare Interoperability Resources (FHIR) will help us develop Health IT services that can be used in our products.  Our engineers have been creating a healthcare domain layer on top of the GE Predix™ Platform, called Predix for Healthcare™, to support interconnection of Health IT systems with our Centricity™ solutions.  They have been using the HL7 FHIR® standard and working with HL7 and other developers, partners, and healthcare providers to ensure that these standards can be the basis for the common, standards-based APIs that enable longitudinal electronic health records and a true “Healthcare Internet.”
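
As a simple illustration of the kind of standards-based interaction FHIR enables, the sketch below performs a FHIR REST search for one patient’s laboratory results.  The server base URL and patient ID are hypothetical placeholders; the request itself follows the standard FHIR search conventions, so any conformant server, from any vendor, would accept the same call.

# Minimal sketch of a standards-based FHIR search, assuming a hypothetical
# FHIR server base URL and patient ID (both placeholders, not real endpoints).
import requests

FHIR_BASE = "https://fhir.example.org/fhir"   # hypothetical server base URL
PATIENT_ID = "example-patient-id"             # hypothetical patient identifier

# Standard FHIR search: all laboratory Observations for one patient.
response = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": PATIENT_ID, "category": "laboratory"},
    headers={"Accept": "application/fhir+json"},
)
response.raise_for_status()
bundle = response.json()  # a FHIR Bundle of Observation resources

# Walk the bundle and print each result's code, value, and unit.
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    code = obs["code"]["coding"][0]["code"]
    quantity = obs.get("valueQuantity", {})
    print(code, quantity.get("value"), quantity.get("unit"))

Because the request and response formats are defined by the standard rather than by any one product, the same client code can be pointed at a different conformant system simply by changing the base URL.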


2 thoughts on “Achieving Data Liquidity . . . How GE is Advancing the State of the Art”

  1. The drive to standardization for the benefit of many is a sound strategy. Driving maturity is a key sign of market leadership. Although there is great need, the challenges are great. What is GE doing to address the challenges that limit standards development and adoption? Does GE have the corporate mental models to succeed, and the influence to impact the industry? What is GE’s strategy to unify and harmonize the many interests across stakeholders, and establish a platform for technical standardization? Business interests must be aligned first for standards to have any level of sustainability and adoption. What formal programs and mechanisms does GE have to drive standards (within the company, across the industry, representing the full value chain)? Has GE included the “citizen patient” as a key factor to drive change? Across industry players, does GE intend to establish “data alliances” as a means of truly leveraging Big Data? I wish you the very best and success.

    • Thank you for your insightful comment and questions. We will address these and similar issues in future blog posts.
