Recently, I boarded a Delta Boeing 757. The plane held 180 passengers and cruised at 517 mph. My flight to Portland, Oregon lasted just under two hours, more than an hour of which I slept through. By contrast, Wilbur Wright covered 852 feet in 59 seconds on that day in 1903 when the Wright brothers completed the first four sustained flights of a powered, controlled airplane.
It’s hard for the modern-day traveler to imagine that the airplane, and the way of life it makes possible, wasn’t always a given. What enabled the aviation industry to evolve from the Wright brothers’ first recognized flight to the kind of air travel we have today? Continue reading
Reading the title, you are probably wondering how the two go together. In big data, both terms name some of the largest challenges to making data genuinely useful within healthcare big data analytics. These pains, explained later, can be addressed early in building useful, large datasets. Understanding them may be the difference between managing a successful “big” dataset and accumulating another collection of useless binary data. Continue reading
Guest blog by Michael Totzke, 3M data analyst.
At the AMIA 2014 convention in Washington D.C., we showcased some of our processes for mapping and maintaining RxNorm drug data in the Healthcare Data Dictionary (HDD). Our poster and podium presentations emphasized that with clinical data, accurate and consistent mapping of terminology standards over successive versions is critical. With the selection of RxNorm as the drug terminology standard required to meet Meaningful Use criteria, it has become necessary for the HDD to maintain RxNorm’s drug data from a longitudinal perspective. Our former process for maintaining RxNorm dealt solely with mapping the current version, with limited regard for managing changes in RxNorm’s data over time. However, it’s not just the initial mapping that matters; having a long-term strategy for maintaining that terminology within a larger terminology server is crucial for ensuring data quality. Medical terminologies change over time, and no algorithm alone can yet guarantee the level of accuracy required for the exchange of clinical data. Continue reading
It has been impossible to ignore the number of data breaches in the news lately. Whether you have been directly affected or not, one thing is certain: data security should be a top priority, especially in the healthcare industry. This eye-opening article describes hackers’ new focus: healthcare data, which now fetches far more on the black market than Social Security or credit card numbers. John Halamka, CIO at Beth Israel Deaconess Medical Center in Boston, sums it up:
While a stolen Social Security number might sell for 25 cents in the underground market, and a credit card number might fetch $1, “A comprehensive medical record for me to get free surgery might be $1,000,” Halamka says. “It is a commodity that is hot on the black Internet [market].” Continue reading
Our philosophy at 3M is to approach terminology mapping and semantic interoperability using a centralized terminology server. With terminology management and maintenance centralized, each data source needs to be mapped only once. After that single mapping, every other system connected to the centralized server can leverage it, so data can be translated and exchanged without losing meaning. Therefore, for n systems that need to be mapped, only n mappings need to be performed.
On the other hand, in a point-to-point approach, each system is mapped directly to every other system. This is feasible when dealing with a few systems, but it becomes unwieldy as their number grows: n systems require n(n-1)/2 point-to-point mappings. With three systems, the total number of mappings is three; with five systems, it jumps to 10. This is illustrated in Figure 1. Continue reading
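The arithmetic behind the two approaches can be sketched in a few lines (the function names here are ours, purely for illustration):

```python
def centralized_mappings(n: int) -> int:
    """Each of the n systems is mapped once to the central server."""
    return n

def point_to_point_mappings(n: int) -> int:
    """Every pair of systems needs its own mapping: n choose 2."""
    return n * (n - 1) // 2

# The gap widens quickly as systems are added.
for n in (3, 5, 10, 50):
    print(f"{n} systems: centralized={centralized_mappings(n)}, "
          f"point-to-point={point_to_point_mappings(n)}")
```

With 50 systems, the centralized approach needs 50 mappings while point-to-point needs 1,225, which is why the centralized model scales.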
The 2014 National Association of Health Data Organizations (NAHDO) 29th Annual Conference and APCD Workshop was recently held in downtown San Diego. With over 200 attendees and speakers from government, research and healthcare institutions, the event explored the current challenges and discoveries related to healthcare data and reform. Here are some key takeaways from the three-day event:
1. APCDs are not going away.
This is the eighth year the APCD Workshop has been part of the NAHDO Conference. Its growing prominence testifies to the importance of APCDs in healthcare market analysis, policy-making and consumer reporting. APCDs continue to grow in number and variety: 11 are now live and more are in development. More vendors also continue to enter this space, proof of the rising demand for APCDs. Continue reading
Attributing a person to a primary care physician (PCP) is an essential feature of population health management because it enables an accurate and fair assessment of the quality of care a provider delivers. Attribution rests on the concept that a PCP is responsible for a person’s care across time and the entire continuum of care, and it formalizes that responsibility as a relationship between a person and his or her PCP. When members have a designated PCP, plans can consider the overall health of each PCP’s unique panel of patients, enabling them to measure and reward provider performance on an apples-to-apples basis. Continue reading
I am known for having a “glass is half full” optimistic view of life, so when I was recently presented with the opportunity to be a panelist at a Meaningful Use discussion, I accepted. The discussion was held at the American Association for Clinical Chemistry (AACC) 2014 annual meeting in Chicago. This year’s Healthcare Forum session was held jointly with the American Society for Clinical Laboratory Science (ASCLS). I pondered the best way to adequately portray the complexity (a.k.a. frustration and confusion) occurring across the industry as hospitals attempt to keep up with MU (not to mention all of the other federal mandates). Continue reading
Every hospital in the U.S. is being pushed to improve patient experience, health outcomes, and total costs. Not every hospital has a data analyst, let alone a team of analysts, dedicated to measuring that progress. Fortunately, the 3M Client Experience Summit provided plenty of opportunities to learn from presenters and trend-setting hospitals.
This year, for the first time, several sessions at the Summit focused on population health. Amirav Davy, senior clinical analyst at Allina Health, talked about how to “provide information that matters” in improving transitions of care.
Following his presentation, 3M met with Amirav to learn more about how Allina uses analytics to improve the delivery of healthcare. Here are some excerpts from the conversation: Continue reading
If you need a coronary artery bypass graft, India might not be the first place you’d think of to have the surgery done, but you might want to think again. A coronary bypass graft in the U.S. is likely to cost $88,000. The same treatment in a JCAHO-accredited hospital in India costs only $9,500. Hospitals in both countries meet world-class quality standards, so why does the surgery cost so much less in India? The answer is simple: innovation in the healthcare delivery process.
A recent Harvard Business Review article studied seven hospitals in India that are delivering world-class care at a fraction of the cost in the U.S. These hospitals are able to deliver affordable, high-quality care largely because they have adopted a hub-and-spoke model for delivery. They concentrate the most expensive equipment (PET scanners, CyberKnife systems, and cyclotrons) and specialized physicians in the hub, while the spokes house general practitioners and lower-cost equipment. Patients are diagnosed and care plans are created in the hub, and treatment is delivered in the regionalized spokes. Continue reading