Hot off the presses: my colleague Janice Jones and I contributed to an article on how healthcare documentation specialists (HDS) can address problems with dictation in this month’s For the Record magazine. The tips and tools discussed in the article are great for those who process dictation via traditional transcription or editing of speech-recognized text. However, I would like to focus this post on best practices for dictators. Good dictation habits are a huge help to the team that provides transcription and editing services, and clear, complete dictation improves efficiency and quality like nothing else. So, to avoid the “back-and-forth” with healthcare documentation services and clinical documentation improvement, here are some tips to consider. Continue reading
No, I don’t have writer’s block. I just have blanks on my mind.
Healthcare documentation specialists (HDS) – such as medical transcriptionists, speech recognition editors, and quality analysts – insert a “blank” or “flag” into a document to indicate to a physician where information is missing, incorrect, or questionable. But are blanks good or bad? In a perfect world, the physician would dictate or directly enter the document clearly, accurately, and with all relevant details the first time around. However, life is messy, and clinicians are very busy human beings, so there are many reasons why resolution of blanks is a necessary part of the document creation process. The HDS’s eyes and ears are important to ensure the quality of the documentation. First and foremost, they look and listen for potential patient care and safety concerns, but they also ensure the patient’s story reflects the professionalism and integrity of the individuals and organizations involved in the patient’s care. Continue reading
I attended two AHIMA events this fall – the Health Information Integrity Summit back in September, and the Annual Convention and Exhibit in October. These events have prompted me to think about data governance – the people, processes, and technology that are put in place to create a framework for capturing data. My background in document creation workflows and technology makes me keenly aware of how quality issues can make or break the success of documentation processes further downstream in the cycle, such as coding, analytics, and system interoperability.
I often hear the cliché “garbage in, garbage out” being used to describe how bad content capture practices can lead to a myriad of problems when attempting to use captured data and documentation for decision making and quality improvement. Continue reading
How do you know all of the elements are in place for high-quality healthcare documentation? Where is the “sweet spot” in which the people, processes, and technology come together to deliver the optimal mix of complete, accurate, and timely content? AHDI and AHIMA have identified seven contributors to documentation quality, all of which deserve careful consideration when designing a quality assessment program.
The Author: Physicians and other clinicians affect the quality of documentation more than anyone, or anything, else. Whether dictating or entering content directly, caregivers need to organize and articulate their thoughts so that the patient’s care, and the context around it, is clearly understood by the reader. Organizations need to have training and tools available to assist dictators with developing skills that enable them to follow standards and optimize the results being generated from their speech. Some challenges, such as pronounced accents, may always be present, but many bad habits can be improved or eliminated with practice. Continue reading
Full disclosure – I’ve spent the better part of the last few weeks reading and analyzing the latest move on CMS’ part to help control observation services – the invention of the Two Midnight Rule, or TMR as I like to call it. For those of you who do not spend all your time analyzing regulations, the TMR is not really a bad idea, since CMS is using it to try to reduce unnecessary hours and days in observation. On the surface, they are looking to help reduce the higher beneficiary co-pays, but they are also making their auditors’ lives easier and their audits less costly to the Trust Fund. Getting folks on the right path in the first place is much more cost effective than chasing after them to pay attention later on.
CMS held an open call on September 26, 2013, to reiterate its intent to keep the two-midnight plan. During the call, CMS also announced its ‘Probe and Educate’ plan, which consists of MACs auditing only cases of less than two midnights between October 1 and December 31, 2013. There will be no RAC or MAC audits of two-midnight stays during this time. However, the OIG, ZPICs, and other review entities can still review any claims they deem necessary during this period – including two-midnight stays.
Rather than take you down the long and winding path of how observation services got out of control, I’ll focus on what TMR means and suggest a few things you can do to survive and thrive under this new requirement. Continue reading
Donna: Hey Sue, I am getting questions regarding Clinical Documentation Improvement (CDI) staff involvement in ICD-10 implementation.
Sue: Hopefully the CDI staff will be involved in the enterprise-wide ICD-10 education, allowing them to fully understand the documentation changes needed to support ICD-10 as well as what stays the same. They’ll be on the front lines educating the physicians on I-10.
Donna: That’s true, along with generating queries for additional documentation. I see them starting to query as soon as possible for some of the specificity needed to minimize the query overload that is anticipated after October 2014.
Sue: They can pick a few diagnoses to start with and then gradually add more – you know, kind of easing themselves as well as the physicians into I-10. Take, for example, the diagnosis of respiratory failure. In ICD-9, the diagnosis of “respiratory failure” defaulted to the code for acute respiratory failure. In ICD-10 that won’t be the case – it will default to an unspecified respiratory failure code. So, today, when the CDI staff sees a diagnosis of unspecified respiratory failure, they can work on clarifying whether it is acute, chronic, or acute-on-chronic. Oh, and now in ICD-10, respiratory failure is further specified as being “with hypoxia” or “with hypercapnia.” Continue reading
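Sue’s respiratory failure example can be sketched as a small lookup table. The code values below are the standard ICD-10-CM J96 family; the lookup function itself is a hypothetical illustration of how added specificity in the documentation drives code selection – not any real encoder’s API.

```python
# Illustrative sketch: how clarified documentation maps to more specific
# ICD-10-CM codes for respiratory failure (the J96 family). The codes are
# standard J96 values; code_for() is hypothetical, for illustration only.

ICD10_RESP_FAILURE = {
    # (acuity, gas-exchange qualifier) -> ICD-10-CM code
    ("acute", "unspecified"):            "J96.00",
    ("acute", "hypoxia"):                "J96.01",
    ("acute", "hypercapnia"):            "J96.02",
    ("chronic", "unspecified"):          "J96.10",
    ("chronic", "hypoxia"):              "J96.11",
    ("chronic", "hypercapnia"):          "J96.12",
    ("acute on chronic", "unspecified"): "J96.20",
    ("acute on chronic", "hypoxia"):     "J96.21",
    ("acute on chronic", "hypercapnia"): "J96.22",
    ("unspecified", "unspecified"):      "J96.90",
}

def code_for(acuity: str, qualifier: str = "unspecified") -> str:
    """Return the J96 code; undocumented detail falls back to unspecified."""
    return ICD10_RESP_FAILURE.get(
        (acuity, qualifier), ICD10_RESP_FAILURE[("unspecified", "unspecified")]
    )

# Unqualified "respiratory failure" no longer defaults to acute:
print(code_for("unspecified"))       # J96.90
# A clarified query yields the specific code:
print(code_for("acute", "hypoxia"))  # J96.01
```

This is exactly the gap a CDI query closes: without the acuity and the hypoxia/hypercapnia qualifier, the coder lands on J96.90 instead of a specific code.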
What do we want?
When do we want it?
If only it were that simple. It’s easy for an organization to say, “We have a quality program for healthcare documentation,” but what exactly does that mean? When designing or evaluating a documentation quality assessment (QA) program, there are many factors to consider. If a QA plan is comprehensive, it has the following characteristics: Continue reading
I recently read a blog post about commercializing big data in healthcare that listed some very interesting figures:
- 90 percent of the world’s data is less than two years old
- Total data collected will grow by 40 percent next year
- Per IBM’s estimates, 2.5 quintillion bytes of new data are generated each day (a quintillion is 10^18, or a 1 followed by 18 zeros)
Now that is a lot of data. Digital pieces (bits and bytes) of information, stored on servers, just waiting for someone to make sense of it and do something useful with it. When we get this much data, we get creative and call it “Big Data.” Some industries are already starting to use the Big Data they are gathering to benefit themselves and their customers. Think of financial services, insurance, and retail. Continue reading
With all of the challenges around healthcare documentation lately, it’s fun to dream of “documentation utopia.” In other words, what are the guiding principles of documentation quality that ensure every patient encounter is documented efficiently and accurately, with the appropriate detail and timeliness? The following ideas inspired the development of the existing AHDI/AHIMA best practice recommendations.
Every document is an accurate, detailed, and complete description of the patient encounter. A high-quality document will not leave blanks or inconsistencies that require queries or addenda. Physicians and other caregivers need good tools to capture the information efficiently without bogging down their workflow, cutting corners, or making errors. As the ICD-10 date looms closer, healthcare organizations should ensure that each department has the optimal content capture technology for their situation, whether with dictation/transcription, speech recognition, direct template entry, or other methods. Continue reading
There is a lot of buzz these days about the application of big data and machine learning in a variety of industries, including healthcare. Statistics have always played an important part in many aspects of healthcare, from determining effective treatments and evaluating new drugs, to analyzing patient populations for risk. So if statistical analysis is already commonplace in healthcare, why all the hype about big data and machine learning? How are they any different from the things we already do? I believe we’re only just beginning to catch a glimpse of how these technologies will transform our industry over the coming decade.
In the past, data for use in analysis has been hard to come by. Because the methods used for analysis worked on a sample or subset of data, it was necessary to randomly select data and to control noise and differences in that data. The acquisition and selection of said data could be time-consuming and costly. Even today one might have to manually search paper records and charts to find patients meeting the criteria needed for one’s research. In recent years, data has become increasingly available in electronic form. This has been driven by the improvement and lower cost of technology, as well as by incentives (such as those contained in ARRA) that promote more widespread technology adoption. If anything, the challenge is now becoming that there is too much data to be sifted through by “traditional” statistical approaches. Data now comes from a dizzying number of sources both inside and outside of the care setting that will only continue to grow, hence the term “big data.” Continue reading