Attributing a person to a primary care physician (PCP) is an essential feature of population health management because it enables an accurate and fair assessment of the quality of care a provider delivers. Attribution rests on the concept that a PCP is responsible for a person across time and the entire continuum of care, and it formalizes that responsibility by creating a relationship between a person and his or her PCP. When members have a designated PCP, plans can consider the overall health of a PCP's unique panel of patients, enabling them to measure and reward provider performance on an apples-to-apples basis.
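One common attribution heuristic, sketched below purely for illustration (the post does not specify any particular method, and the data shapes here are invented), assigns each member to the provider who delivered the plurality of that member's primary care visits:

```python
from collections import Counter

def attribute_patients(visits):
    """Assign each patient to the PCP with the most visits.

    visits: list of (patient_id, provider_id) tuples, one per encounter.
    Returns a dict mapping patient_id -> attributed provider_id.
    """
    per_patient = {}
    for patient, provider in visits:
        per_patient.setdefault(patient, Counter())[provider] += 1
    # most_common(1) picks the provider with the plurality of visits
    return {p: counts.most_common(1)[0][0] for p, counts in per_patient.items()}

visits = [
    ("pat1", "drA"), ("pat1", "drA"), ("pat1", "drB"),
    ("pat2", "drB"),
]
print(attribute_patients(visits))  # {'pat1': 'drA', 'pat2': 'drB'}
```

Real attribution logic is considerably richer (visit types, recency, specialty, member elections), but a plurality rule like this is the usual starting point.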
I am known for having a “glass is half full” optimistic view of life, so when I was recently presented with the opportunity to be a panelist at a Meaningful Use discussion, I accepted. The discussion was held at the American Association for Clinical Chemistry (AACC) 2014 annual meeting in Chicago. This year’s Healthcare Forum session was held jointly with the American Society for Clinical Laboratory Science (ASCLS). I pondered the best way to adequately portray the complexity (a.k.a. frustration and confusion) occurring across the industry as hospitals attempt to keep up with MU (not to mention all of the other federal mandates).
Every hospital in the U.S. is being pushed to improve patient experience, health outcomes, and total costs. Not every hospital has a data analyst, let alone a team of analysts, dedicated to measuring that progress. Fortunately, the 3M Client Experience Summit provided plenty of opportunities to learn from presenters and trend-setting hospitals.
This year, for the first time, several sessions at the Summit focused on population health. Amirav Davy, senior clinical analyst at Allina Health, talked about how to “provide information that matters” in improving transitions of care.
Following his presentation, 3M met with Amirav to learn more about how Allina uses analytics to improve the delivery of healthcare. Here are some excerpts from the conversation:
If you need a coronary artery bypass graft, India might not be the first place you’d think of to have the surgery done, but you might want to think again. A coronary bypass graft in the U.S. is likely to cost $88,000. The same procedure in a JCAHO-accredited hospital in India costs only $9,500. Hospitals in both countries meet world-class quality standards, so why does the surgery cost so much less in India? The answer is simple: innovation in the healthcare delivery process.
A recent Harvard Business Review article studied seven hospitals in India that are delivering world-class care at a fraction of U.S. costs. These hospitals are able to deliver affordable, high-quality care largely because they have adopted a hub-and-spoke model for delivery. They concentrate the most expensive equipment (PET scanners, cyberknives, and cyclotrons) and specialized physicians in the hub, and keep general practitioners and lower-cost equipment in the spokes. Patients are diagnosed and care plans are created in the hub, and treatment is delivered in the regionalized spokes.
Guest blog by Tiffany Harman, RN, Clinical Analyst with 3M Health Information Systems
The discussion around the potential and necessity for big data analytics has reached a turning point. Many hospitals, health plans, and accountable care organizations realize that the future of healthcare and population management is going to depend on big data and are now trying to decide how to best use it. So what exactly is big data?
Big data is a buzzword, or catchphrase, used to describe a volume of structured and unstructured data so massive that it is difficult to process using traditional database and software techniques. Big data analytics play a crucial role in the financial success of an organization and the physical health of the patients it serves. There is a lot of discussion around how we could use the data, and some discussion around how we are using it. But before we use this data to treat populations, we cannot ignore its reliability and validity. Everyone is racing to apply their analytics to reduce healthcare costs and improve patient outcomes. Organizations must understand the details of the data they are analyzing in order to depict every aspect of the patient care performed accurately.
Last week, we hosted one of our Executive Council meetings up in Park City, Utah. These meetings bring together executives from many of our client sites to discuss the challenges they’re facing and ways in which 3M Health Information Systems can help them achieve their goals. It wasn’t always easy keeping the group focused on the topics instead of the view of the snow-covered mountains through the window behind me, but when the subject of analytical needs came up, I had their attention. The discussion quickly turned to the challenges they face in getting the data needed to manage regulatory requirements, reduce costs, and improve quality of care. The client executives participating voiced a number of concerns, including:
- Difficulty in getting longitudinal data across the healthcare continuum
- Inability to get data from unstructured text within their EMR
- Limitations of working with claims data alone
- Use of data to identify “avoidable care” so they can reduce costs and improve outcomes
- Data needed to manage compliance risk
“Big data” is a collection of data so large that common database tools cannot easily manage it. Imagine a wilderness of datasets, endless rows and columns of data points as yet unexplored and untamed.
It sounds adventurous. Google tells me that big data can help me drive outcomes and spark innovation. With the help of advanced analytics, I can harness the digital universe and unlock big data’s hidden value. The rush of metaphors makes me dizzy.
Is it big data or big hype?
I asked Jason Mark, Master Black Belt for the Lean Six Sigma program at 3M Health Information Systems. “There is a lot of hype,” he said. “Big data won’t solve your problems any more than cloud computing or an EHR. But it can make you better informed and give you more information to address and improve performance.”
In his post on EHRs and Pay for Performance, Paul Cerrato cites a study by Jonathan Weiner et al. at Johns Hopkins University, titled New Paradigms for Measuring Clinical Performance Using Electronic Health Records, that identifies many shortcomings in current EHR support for a shift to Pay for Performance (P4P). Cerrato particularly calls out statistics showing that only very small fractions of ambulatory care encounters are fully documented in EHRs and interoperable across providers, both of which are necessary to meaningfully impact quality of care across provider organizations.
There are two issues at work here: the capture of data and the use of data. When two provider EHRs cannot adequately share data for a single patient, that is a problem of data use. If the data cannot be used effectively across provider organizations, there is little chance of using EHRs to drive improved quality, and thus of succeeding at P4P.
One of life’s little pleasures is that you can travel anywhere in the world, put your bank card into an ATM, and withdraw money in the currency of that country. It seems magical, and it sets the standard for interoperability of data. On the other hand, one of life’s frustrations is that you can’t move from state to state, from one insurance carrier to another, or even from a hospital to the one across the street, and seamlessly access your personal health record. Ever wonder why transferring health information is so much harder?
Healthcare by its very nature is much more complex than financial transactions. When clinicians are trying to determine what to do for a particular patient, the information could come from many sources. It could come from information systems in imaging services or the laboratory, from the patient history and physical exam, or from devices such as bedside monitors. These different information sources “talk” in different terms and codes, called terminologies.
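At its simplest, bridging terminologies means maintaining a crosswalk from each source system's local codes to one shared identifier. The sketch below illustrates the idea only; the system names and codes are invented placeholders, not real LOINC or SNOMED values:

```python
# Minimal terminology crosswalk: three source systems encode the same
# concept ("serum glucose test") with their own local codes, and an
# interface layer maps each onto a single shared identifier.
CROSSWALK = {
    ("lab_system", "GLU-001"): "shared:glucose-serum",
    ("monitor_vendor", "0x41F"): "shared:glucose-serum",
    ("emr_local", "LAB.GLUC"): "shared:glucose-serum",
}

def normalize(source_system, local_code):
    """Translate a source-specific code to the shared terminology, if known."""
    return CROSSWALK.get((source_system, local_code))

print(normalize("lab_system", "GLU-001"))    # shared:glucose-serum
print(normalize("monitor_vendor", "0x41F"))  # shared:glucose-serum
```

The hard part in practice is not the lookup but building and maintaining the map: codes proliferate, meanings drift, and a single local code can correspond to several shared concepts depending on context.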
Today, 3M HIS announced it has acquired CodeRyte. As division scientist, this is exciting news from a technology perspective. Here’s my take on what it means for NLP:
The distinctive feature of CodeRyte’s technology is its strong statistical NLP capability. As I discussed in the 3M white paper, Auto-coding and Natural Language Processing, statistical machine learning systems offer the possibility of significant accuracy improvements in data-rich environments, compared to traditional rules-based approaches. A good example is in the outpatient coding environment, where data volumes are large, and CodeRyte has a history of performing very well compared to other systems.
There are multiple ways to boost the accuracy of statistical NLP, and the combination of 3M and CodeRyte will allow us to pursue several paths to the direct benefit of end users.