Against the backdrop of a global health crisis, news that the U.S. federal government published its Final Rule on Interoperability likely went unnoticed by many delivery system leaders outside of healthcare IT. However, the federal mandate marks an important milestone for U.S. patient health: it mandates secure access, exchange, and use of electronic health information.
What once was an aspirational future state has moved to mission critical, accelerated by the COVID-19 crisis. Never has the need for data integration been so acute. At a time when clinicians, researchers, and public health officials could benefit from information to better track the spread of outbreaks, pinpoint the most vulnerable populations, and share therapeutic outcomes, the barriers created by legacy integration methods have been exposed.
Information is siloed across various databases, platforms, and organizations. Interfaces must be hard-coded each time an application or its underlying data structure changes. The technical and policy limitations built into healthcare tech infrastructure make it incredibly difficult to quickly leverage valuable data or innovate.
Data Sharing: A Top Priority for Health IT
As you might expect, data access and integration are top priorities for healthcare IT organizations. Chilmark Research’s Market Trends Report on the future of open API integration notes that unlocking more value from data accumulated across the healthcare ecosystem will be key to “clinical and financial renewal” moving forward.
But this requires a big leap forward in terms of enabling technologies. While APIs have been a commonly accepted technology for data-sharing in other industries, healthcare has been slower to adopt due to current business models, perceived privacy and security vulnerabilities, and a lack of industry-wide standards. Perhaps a bit of analysis paralysis comes into play too, as healthcare IT teams work out the best ways to establish an “open” environment.
We heard firsthand about some of these challenges from members of the College of Healthcare Information Management Executives (CHIME) who participated in a LexisNexis® Risk Solutions focus group in January. Movement toward data sharing has been a gradual process, with careful consideration to:
- Striking the right balance between interoperability and data security
- Understanding nuances of appropriate data sharing: knowing who’s requesting and for what purpose and determining what’s appropriate to access
- Ensuring, above all, patient security, privacy, and trust
Action Steps We Can Take Today
Even as the compliance dates loom, let’s look at some data exchange and integrity steps that CIOs and their organizations can take now.
You need to know how your EMR is going to meet these requirements and make sure you’re comfortable with it.
Much of the interoperability compliance is going to be delivered by the EMR system. It is vital to focus on areas where the EMR capabilities are dependent on your own local workflows, facility-specific integration, and enterprise resources.
Pay special attention to patient demographics and your ability to ensure the identity of the patient or designee who is attempting to access data. The data used to do so can very quickly become outdated as people relocate, change marital status and move through life. It’s critical to continually update patient records with the latest demographic information, not only to serve authentication activities, but also to support outreach efforts.
The second part of the mandate starts in 2022 with the mandatory patient access API.
Many providers will again rely on EMRs to do this, but that will only work to a point. It is your responsibility, not the EMR vendor’s, to establish that the person using the app is who they say they are.
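The patient access API mandated by the rule is based on the HL7 FHIR standard, which returns patient data as structured JSON resources. As a minimal sketch of what an application consuming that API works with, the snippet below parses a trimmed sample FHIR R4 Patient resource and pulls out the demographic fields most relevant to identity matching (the sample values and the `summarize_patient` helper are illustrative, not from any real endpoint):

```python
import json

# A trimmed HL7 FHIR R4 Patient resource, shaped like what a patient
# access API endpoint might return (sample values, not real data).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"use": "official", "family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25",
  "address": [{"city": "Ann Arbor", "state": "MI", "postalCode": "48104"}]
}
"""

def summarize_patient(resource):
    """Extract the demographics most useful for verifying identity."""
    official = next(
        (n for n in resource.get("name", []) if n.get("use") == "official"),
        {},
    )
    addr = (resource.get("address") or [{}])[0]
    return {
        "family": official.get("family"),
        "given": " ".join(official.get("given", [])),
        "birthDate": resource.get("birthDate"),
        "city": addr.get("city"),
        "state": addr.get("state"),
    }

summary = summarize_patient(json.loads(patient_json))
```

Because FHIR standardizes the resource shape, the same parsing logic works regardless of which EMR vendor sits behind the endpoint; what it cannot do is tell you whether the app holding the access token belongs to the patient, which is where identity proofing comes in.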
Authentication processes must also assess the risk of the identity and the risk of fulfilling the transaction. At the same time, these tools must also allow legitimate users to enjoy seamless experiences across devices, platforms and applications whether online, mobile or in-person.
Many healthcare systems still use single-factor authentication in an effort to reduce user friction, but multi-factor authentication should be the industry standard.
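To make the multi-factor point concrete, here is a minimal sketch of a second factor: an RFC 6238 time-based one-time password (TOTP), the mechanism behind common authenticator apps. It uses only the Python standard library; the `verify_second_factor` helper and its drift window are illustrative choices, not a prescribed implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at if at is not None else time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_second_factor(secret_b32, submitted, window=1):
    """Accept codes from adjacent time steps to tolerate clock drift."""
    now = int(time.time())
    return any(
        hmac.compare_digest(totp(secret_b32, now + offset * 30), submitted)
        for offset in range(-window, window + 1)
    )
```

A password alone proves only knowledge; pairing it with a code like this proves possession of a registered device, which is the baseline the article argues the industry should adopt. Risk scoring of the transaction itself would layer on top of this.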
Look beyond your EMR for data and use cases that the EMR cannot meet.
You may want to evaluate a best-of-breed interoperability middleware or other data integration capability. An EMR is a workflow system, not an interoperability engine. By developing your own capability, you will have better control and ability to support enterprise-wide requirements for API and data security / privacy beyond the mandated requirements and their narrow use cases.
Many healthcare teams believe that system interoperability hinges on the development of a National Patient Identifier.
While a universal, unique patient identifier is important to enabling interoperability, it doesn’t have to be one currently designated by the government.
This was a heavily debated topic among the CHIME executives in our focus group. Some organizations have attempted to use a master patient index as the common identifier, but this very quickly breaks down when you have to work across multiple patient indexes or interact with the outside world. However, there are ways to leverage a third-party data partner to create a unique identifier for each individual (without using SSN) that can be used to help cleanse and aggregate the data across systems.
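As a simplified sketch of the idea, the snippet below derives a deterministic, SSN-free identifier by hashing normalized demographics, so the same person entered slightly differently in two systems still links to one token. Real referential-matching services use far richer data and probabilistic logic; the `match_token` function and its chosen fields are purely illustrative:

```python
import hashlib
import re

def normalize(value):
    """Uppercase and strip punctuation/whitespace so trivial formatting
    differences don't defeat the match."""
    return re.sub(r"[^A-Z0-9]", "", value.upper())

def match_token(first, last, dob, zip_code):
    """Derive a deterministic identifier from normalized demographics.
    Records that agree on these fields get the same token and can be
    aggregated across systems without exposing an SSN."""
    basis = "|".join(normalize(v) for v in (first, last, dob, zip_code))
    return hashlib.sha256(basis.encode()).hexdigest()[:16]

# The same person entered two ways in two systems links to one token.
a = match_token("Maria", "O'Brien", "1980-03-07", "30301")
b = match_token("MARIA", "OBRIEN", "19800307", "30301")
```

The obvious weakness of this naive version is exactly what the focus group flagged: demographics drift as people relocate or change names, which is why a data partner that keeps the underlying reference data current adds value over a one-time hash.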
The reality is that there is time to think this through and get it right. The reality, too, is that as a U.S. healthcare system we aren’t ready yet. The best we can do is begin to evaluate and assess our current systems and decide where we need to plug in support and technology.