Category Archives: Data Translation


Achieving Semantic Interoperability: Point-to-Point Versus Centralized Mapping

Our philosophy at 3M is to approach terminology mapping and semantic interoperability using a centralized terminology server. With a centralized source of terminology management and maintenance, each data source needs to be mapped only once. Once this single mapping occurs, all the other systems that are mapped to the centralized server can leverage the mappings so data can be translated and exchanged without losing meaning. Therefore, for n systems that need to be mapped, only n mappings need to be performed.

On the other hand, in a point-to-point mapping approach, each system is mapped directly to every other system. While this is a feasible approach when dealing with a few systems, it becomes unwieldy as the number of systems increases. For example, given three systems to map, the total number of mappings that need to be created is three. However, if we increase the number of systems to five, the point-to-point mappings increase to 10. This is illustrated in Figure 1. Continue reading
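The scaling difference between the two approaches can be sketched in a few lines. This is an illustrative sketch, not 3M code: point-to-point mapping requires one mapping per pair of systems (n choose 2), while centralized mapping requires one mapping per system.

```python
from math import comb

def point_to_point_mappings(n: int) -> int:
    """Each system maps directly to every other system: n choose 2, i.e. n*(n-1)/2."""
    return comb(n, 2)

def centralized_mappings(n: int) -> int:
    """Each system maps once to the central terminology server."""
    return n

# Reproduce the figures from the text: 3 systems -> 3 mappings, 5 systems -> 10
for n in (3, 5, 10, 50):
    print(f"{n} systems: point-to-point = {point_to_point_mappings(n)}, "
          f"centralized = {centralized_mappings(n)}")
```

At three systems the two approaches happen to cost the same, but the gap widens quadratically: fifty systems would require 1,225 point-to-point mappings versus fifty centralized ones.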

All-Payer Claims Databases (APCDs) and More: Key Takeaways from the NAHDO Annual Conference

The 2014 National Association of Health Data Organizations (NAHDO) 29th Annual Conference and APCD Workshop was recently held in downtown San Diego. With over 200 attendees and speakers from government, research and healthcare institutions, the event explored the current challenges and discoveries related to healthcare data and reform. Here are some key takeaways from the three-day event:

1. APCDs are not going away.

This is the eighth year the APCD Workshop has been part of the NAHDO Conference. Its increasing prominence testifies to the importance of APCDs in healthcare market analysis, policy-making and consumer reporting. They continue to grow in number and variety—11 are now live and more are in development. Additionally, more vendors continue to enter this space—proof of the rising demand for APCDs. Continue reading

What Can Go Wrong with PCP Attribution and How It Can Be Prevented

Attributing a person to a primary care physician (PCP) is an essential feature of population health management because it enables an accurate and fair assessment of the quality of care a provider delivers. Attribution is based on the concept that a PCP is responsible for a person across time and the entire continuum of care; it formalizes this responsibility, creating a relationship between a person and his or her PCP. When members have a designated PCP, plans are able to consider the overall health of a PCP’s unique panel of patients, enabling them to measure and reward provider performance on an apples-to-apples basis. Continue reading

Meaningful Use and Laboratory Data Management: Getting to the Next Level

I am known for having a “glass is half full” optimistic view of life, so when I was recently presented with the opportunity to be a panelist at a Meaningful Use discussion, I accepted. The discussion was held at the American Association for Clinical Chemistry (AACC) 2014 annual meeting in Chicago. This year’s Healthcare Forum session was held jointly with the American Society for Clinical Laboratory Science (ASCLS). I pondered the best way to adequately portray the complexity (a.k.a. frustration and confusion) occurring across the industry as hospitals attempt to keep up with MU (not to mention all of the other federal mandates). Continue reading

What Every Hospital Ought to Know About Measuring Patient Quality, Cost, and Experience

Every hospital in the U.S. is being pushed to improve patient experience, health outcomes, and total costs. Not every hospital has a data analyst, let alone a team of analysts, dedicated to measuring that progress. Fortunately, the 3M Client Experience Summit provided plenty of opportunities to learn from presenters and trend-setting hospitals.

This year, for the first time, several sessions at the Summit focused on population health. Amirav Davy, senior clinical analyst at Allina Health, talked about how to “provide information that matters” in improving transitions of care.

Following his presentation, 3M met with Amirav to learn more about how Allina uses analytics to improve the delivery of healthcare. Here are some excerpts from the conversation: Continue reading

Will Population Health Management Drive Innovation? It Depends on the Data

If you need a coronary artery bypass graft, India might not be the first place you’d think of to have the surgery done, but you might want to think again. A coronary bypass graft in the U.S. is likely to cost $88,000. The same treatment in a JCAHO-accredited hospital in India costs only $9,500. Hospitals in both countries meet world-class quality standards, so why does the surgery cost so much less in India? It’s simple: innovation in the healthcare delivery process.

A recent Harvard Business Review article studied seven hospitals in India that are delivering world-class care at a fraction of the cost in the U.S. These hospitals are able to deliver affordable, high quality care largely because they have adopted a hub-and-spoke model for delivery. They concentrate the most expensive equipment (PET scanners, cyberknives, and cyclotrons) and specialized physicians in the Hub. In the spokes, they keep general practitioners and lower cost equipment. Patients are diagnosed and care plans are created in the hub, and the treatment is delivered in the regionalized spokes. Continue reading

So What’s the Big Deal with Big Data?

Guest blog by Tiffany Harman, RN, Clinical Analyst with 3M Health Information Systems

The discussion around the potential and necessity for big data analytics has reached a turning point. Many hospitals, health plans, and accountable care organizations realize that the future of healthcare and population management is going to depend on big data and are now trying to decide how to best use it. So what exactly is big data?

Big data is a buzzword used to describe a volume of structured and unstructured data so massive that it is difficult to process using traditional database and software techniques. Big data analytics play a crucial role in the financial success of an organization and the physical health of the patients it serves. There is a lot of discussion around how we can use the data, and some discussion around how we are using the data. But before we use this data to treat populations, we cannot ignore its reliability and validity. Everyone is racing to apply analytics to reduce healthcare costs and improve patient outcomes. Organizations must understand all of the details of the data they are analyzing in order to accurately depict every aspect of the patient care performed. Continue reading

Healthcare Executives Sound Off on Data Challenges

Last week, we hosted one of our Executive Council meetings up in Park City, Utah. These meetings bring together executives from many of our client sites to discuss the challenges they’re facing and ways in which 3M Health Information Systems can help them achieve their goals. It wasn’t always easy keeping the group focused on the topics instead of the view of the snow-covered mountains through the window behind me, but when the subject of analytical needs came up, I had their attention. The discussion quickly turned to the challenges they face in getting the data needed to manage regulatory requirements, reduce costs, and improve quality of care. The client executives participating voiced a number of concerns, including:

- Difficulty in getting longitudinal data across the healthcare continuum
- Inability to get data from unstructured text within their EMR
- Limitations of claims-only data
- Use of data to identify “avoidable care” so they can reduce costs and improve outcomes
- Data needed to manage compliance risk

Continue reading

Big Data or Big Hype?

“Big data” is a collection of data so large that common database tools cannot easily manage it. Imagine a wilderness of datasets, endless rows and columns of data points as yet unexplored and untamed.

It sounds adventurous. Google tells me that big data can help me drive outcomes and spark innovation. With the help of advanced analytics, I can harness the digital universe and unlock big data’s hidden value. The rush of metaphors makes me dizzy.

Is it big data or big hype?

I asked Jason Mark, the master black belt for the Lean Six Sigma program at 3M Health Information Systems. “There is a lot of hype,” he said. “Big data won’t solve your problems any more than cloud computing or an EHR. But it can make you better informed and give you more information to address and improve performance.” Continue reading

EHRs and Pay for Performance

Echoing Paul Cerrato’s post on EHRs and Pay for Performance, which cites a study by Jonathan Weiner et al. at Johns Hopkins University titled New Paradigms for Measuring Clinical Performance Using Electronic Health Records, there are many shortcomings in current EHR support for a shift to Pay for Performance (P4P). Cerrato particularly calls out statistics showing that only very small fractions of ambulatory care encounters are fully documented in EHRs and interoperable across providers, both of which are necessary to meaningfully impact quality of care across provider organizations.

There are two issues at work here: the capture and the use of data. When two provider EHRs cannot adequately share data for a single patient, this is an issue of data usage. If the data cannot be used effectively across provider organizations, there is little chance of using EHRs to drive improved quality, and thus of succeeding at P4P. Continue reading