Toward High-Value Care: A Quality Improvement Initiative to Reduce Unnecessary Repeat Complete Blood Counts and Basic Metabolic Panels on a Pediatric Hospitalist Service
OBJECTIVE: Achieving high-value health care is a goal of health care providers who strive to increase quality and decrease cost. Decreasing laboratory tests is a potential method to increase value. We used quality improvement methodology to decrease the percentage of unnecessary complete blood counts (CBCs) and basic metabolic panels (BMPs) obtained on a pediatric hospital medicine service from 13.5% to <5%.
METHODS: A study with a pre- and postintervention design included all patients admitted to 2 hospital medicine teams between May 2013 and December 2014. Multiple interventions linked to key drivers were tested through rapid plan-do-study-act cycles. The primary and secondary outcome measures, the percentage of unnecessary CBCs and BMPs and the percentage of consecutive-day tests, were analyzed using statistical process control. Total billed charges, laboratory charges, 7-day readmission rates, and length of stay were compared pre- and postintervention.
RESULTS: Primary outcome of unnecessary CBCs and BMPs was reduced from a baseline of 13.5% to 4.5%. Secondary outcome measure of consecutive day testing was reduced from 20.9% to 8.5%. Median laboratory charges decreased significantly ($842 [$256–$1863] vs $800 [$222–$1616], P = .002), with no significant differences in total billed charges, 7-day readmission rates, or length of stay.
CONCLUSIONS: Rapid cycle plan-do-study-act methodology, initially focusing on the inclusion of a daily laboratory plan in progress notes, was an effective means to improve laboratory utilization and decrease laboratory charges without adversely affecting other quality measures. Spreading these efforts to different patient populations and laboratory tests could have a demonstrable effect on the value of health care.
The turn of the new millennium brought about a rejuvenated interest in patient safety and quality,1,2 and more recently, attention has broadened to improve value in health care as insurance payers, including the Centers for Medicare and Medicaid Services, have begun to base hospital payments on care value.3 Unnecessary services and waste in health care have been identified as 2 potential sources that decrease value, with waste constituting 21% to 47% of health care expenditures.4,5 If value can be defined as quality or outcomes/cost,4,6 then targeting waste reduction in general, and laboratory testing in particular, has the potential to increase value.
Repeat laboratory testing has untoward effects on care quality. Previous literature has not only shown that laboratory testing is overused7 but also that many tests are repeated within a time frame that provides little clinical information.8,9 Repeat tests add to laboratory personnel’s workload and delay efficient reporting of other studies.10 Excessive phlebotomy can result in iatrogenic anemia in both neonates11 and adults.12 Spurious or unexpected results have downstream implications for further testing, increased resource utilization, and overdiagnosis.13–15 Excessive phlebotomy is associated with patient discomfort and sleep disruption and can lead to decreased patient satisfaction.
Because of these issues, multiple projects have sought to reduce repeat testing. Many of these studies have used single information technology interventions including computerized reminders or hard stops regarding duplicate orders,16,17 “unbundling” of panels so physicians are required to order each desired component,18,19 providing justification for each test,20 and computerized decision-support tools.21 Other reports have shown mixed success in providing charge data either while ordering tests,22–24 or on a weekly basis for each patient.25 Aside from 1 study reporting slightly higher unscheduled outpatient follow-up after emergency department visits,23 these studies did not report any patient safety concerns, indicating that the reduction in laboratory tests in these studies was likely from reducing excessive testing.
We set forth to decrease the number of unnecessary complete blood counts (CBCs) and basic metabolic panels (BMPs) on 2 pediatric hospital medicine (HM) teams using rapid cycle plan-do-study-act (PDSA) quality improvement methodology. We hypothesized that a successful initiative would improve value by eliminating waste and improving patient experience. To our knowledge, this is the first report reducing laboratory utilization through multiple interventions targeting provider behavior.
Monroe Carell Jr. Children’s Hospital at Vanderbilt is a large, freestanding, academic children’s hospital with 271 beds and 18 081 discharges in fiscal year 2014, with 2780 from the HM service. There are 2 HM teams, each staffed weekly by 1 of 10 HM faculty physicians, 2 pediatric or medicine-pediatric residents, 4 or 5 interns, and medical students. An attending HM physician is in-house daily to staff admissions until midnight. Residents and interns rotate on a 4-week schedule. The HM service cares for a variety of general pediatric issues and for many subspecialty patients with consultant input. This study was approved by the Vanderbilt University Institutional Review Board.
Planning the Intervention
A multidisciplinary team consisting of HM faculty, pediatric residents, laboratory faculty members, nursing leadership, and a quality consultant mapped the laboratory test ordering process,26 conducted a modified failure mode and effects analysis, and developed a key driver diagram related to theories for improvement and potential interventions (Fig 1).
The team worked together to define an “unnecessary” test. The team determined that a BMP obtained within 1 calendar day of a previous BMP with ≤3 abnormal values and a CBC obtained within 1 calendar day of a previous CBC with ≤1 abnormal value were “unnecessary” for this improvement project. The BMP included sodium, potassium, chloride, bicarbonate, blood urea nitrogen, creatinine, and glucose. The CBC included white blood cell count, hemoglobin, and platelets. Normal age-specific reference ranges for BMPs and CBCs were obtained from the CALIPER study27 and Nelson Textbook of Pediatrics,28 respectively.
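Because the definition above is purely mechanical, it can be expressed compactly in code. The following Python sketch is our illustration only, not part of the study’s actual tooling; the function name and data shapes are assumptions. It applies the team’s “unnecessary” criteria to a repeat test given the prior test of the same type:

```python
from datetime import date

# Thresholds from the improvement team's definition: a repeat BMP is
# "unnecessary" if the prior BMP (within 1 calendar day) had <=3 abnormal
# values; a repeat CBC if the prior CBC had <=1 abnormal value.
ABNORMAL_THRESHOLD = {"BMP": 3, "CBC": 1}

def is_unnecessary(test_type: str, draw_date: date,
                   prior_date: date, prior_abnormal_count: int) -> bool:
    """Return True if a repeat test meets the team's 'unnecessary' definition."""
    within_one_calendar_day = (draw_date - prior_date).days <= 1
    return (within_one_calendar_day
            and prior_abnormal_count <= ABNORMAL_THRESHOLD[test_type])

# A repeat CBC one calendar day after a CBC with 1 abnormal value:
print(is_unnecessary("CBC", date(2014, 5, 2), date(2014, 5, 1), 1))  # True
```

In practice the abnormal-value counts would come from comparing each component against the age-specific reference ranges cited above.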
The interventions focused on 3 key drivers: (1) effective communication among the primary team, (2) knowledge of laboratory charges, and (3) providers understanding the magnitude of the problem. The team used a series of rapid-cycle PDSA interventions to determine their effectiveness.26
Effective Communication Among the Primary Team Members
Before the intervention, there was no standard practice of incorporating laboratory plans or interpretation into the notes or the handover tools used by residents. It was done on a case-by-case basis and, anecdotally, almost never done if no laboratory tests were planned. We began by asking 1 intern to document both laboratory result interpretation and anticipated laboratory plans for the next 24 hours on a single patient, including when no laboratory tests were planned. Feedback revealed that documenting the previous day’s laboratory interpretation was redundant because results were automatically imported into the notes, but including a laboratory plan for the next day did not disrupt intern workflow. Over 2 weeks, the initiative was slowly expanded until all 10 interns on the 2 HM teams were including laboratory plans in their daily notes. This practice carried over from month to month as residents and interns switched rotations, but occasionally a member of the improvement project team would remind the residents to include a laboratory plan. Feedback early in the project via REDCap29 also revealed that the most helpful aspect of the laboratory plan was documentation confirming that no laboratory tests were needed. This allowed the covering night intern to field questions from nursing staff and families overnight that had previously gone unanswered and often resulted in additional testing.
For the first 2 months after the initial intervention, the majority of our unnecessary laboratories were obtained within 1 calendar day of patient admission. HM faculty revealed that future laboratory studies were only discussed when they thought studies were needed. If studies were not discussed, the faculty’s assumption was that no laboratory studies were needed, but the residents’ understanding sometimes differed. This lack of clarity resulted in laboratory studies often being ordered the morning after admission. Therefore, HM attending physicians were asked to initiate a conversation specifically addressing future laboratory testing during the initial admission conversations.
Knowledge of Laboratory Charges
A common misconception of charges at our hospital was that a panel component carried the same charge as a complete panel. Hospital administration provided the improvement team with the charges for BMPs, CBCs, and the individual components of each panel and provided permission to use them as educational materials for this study. The charges were placed on small index cards and taped to computers in HM resident ward work areas. After receiving favorable feedback about the location, size, usability, and effect on ordering habits, they were also posted in the main resident workroom.
Providers Understanding the Magnitude of the Problem
In an effort to gain support from the residents and attending physicians and to illustrate the magnitude of the problem, the team began posting updated control charts in the resident work areas in early February 2014. Shortly thereafter, quality improvement team members began e-mailing each attending physician about the unnecessary laboratory tests obtained during their service week to inquire about the rationale for the tests. This intervention lasted only 1 month, mainly because of its labor-intensive nature. Feedback from the attending physicians revealed these laboratory tests were often ordered without the attending’s explicit knowledge. In late July 2014, data collection became more timely with the creation of a weekly report, and we began e-mailing the HM teams the number of unnecessary tests ordered during the previous week, a control chart, and a short reminder to include a laboratory plan in the notes.
Laboratory data were collected from May 2013 through December 2014 via billing information. All CBC and BMP draws obtained by teams led by an HM attending physician were extracted. Laboratory tests drawn immediately before admission in the emergency department, or in the ICU within 1 calendar day of transfer to the HM service, were included as initial laboratory tests. Data regarding tests drawn at outside hospitals before transfer were not available for review. All repeat tests within 1 calendar day were reviewed to determine if they met “unnecessary” criteria, and the percentage of all repeat tests drawn within 1 calendar day served as the secondary outcome measure. The process measure, inclusion of a laboratory plan in intern notes, was assessed by randomly reviewing 20 charts per week from billing data to determine whether a written future laboratory plan was present. Administrative data from the Pediatric Health Information System (PHIS) were obtained to compare a 12-month preintervention period (December 2012–November 2013) with the postintervention period (December 2013–November 2014). PHIS data were merged with administrative data to capture all patients discharged from the HM service within the study period, and these data were used to analyze the charge and length of stay (LOS) balancing measures. Our second balancing measure, 7-day readmission rates, was obtained using our readmission dashboard.
Analysis of primary and secondary outcome measures was performed using statistical process control p-charts, with 8 points above or below the mean representing special cause variation. Seven-day readmission rates were compared between the pre- and postintervention periods using 2-tailed t tests. Median LOS, total billed charges, and laboratory charges were compared between the pre- and postintervention periods with a Wilcoxon rank sum test using Stata 13.0 software. Hospitalizations with total billed charges below the fifth or above the 95th percentile were excluded from the charge and LOS analyses: those below the fifth percentile were thought to represent short admissions with few opportunities for repeat laboratory testing, and those above the 95th percentile likely spent the vast majority of their time in the ICU or on surgical services, where the HM service contributed little to the overall charges.
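For readers unfamiliar with p-charts, the centerline is the overall proportion p̄ of unnecessary tests, and each weekly subgroup gets 3σ limits of p̄ ± 3√(p̄(1 − p̄)/nᵢ). The following Python sketch is illustrative only (the study used standard SPC tooling, and the weekly counts shown are invented):

```python
import math

def p_chart_limits(defect_counts, sample_sizes):
    """Centerline and per-subgroup 3-sigma control limits for a p-chart.

    defect_counts[i] -- unnecessary tests in week i
    sample_sizes[i]  -- total tests drawn in week i
    """
    p_bar = sum(defect_counts) / sum(sample_sizes)  # centerline
    limits = []
    for n in sample_sizes:
        # Binomial standard deviation of a proportion for subgroup size n
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma),
                       min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

# Invented weekly counts, purely for illustration
p_bar, limits = p_chart_limits([14, 12, 9, 5], [110, 105, 98, 102])
```

The special-cause rule used in this study (8 consecutive points on one side of the centerline) is then checked against these limits and the centerline week by week.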
The primary outcome measure, percentage of unnecessary BMPs and CBCs ordered, decreased from 13.5% at baseline to 4.5% (Fig 2). This outcome was sustained for 42 weeks, and the control limits narrowed, indicating a more stable panel-ordering process on the HM service. The largest improvement in our primary outcome measure was temporally related to the laboratory plan intervention, which decreased unnecessary testing from 13.5% to 7.9%. Sharing the data with residents and attending physicians and asking for direct feedback about failures resulted in a second shift of the mean to 4.5%, allowing us to attain our goal.
We also saw an improvement in our secondary outcome, which was a reduction in the percentage of tests repeated within 1 calendar day. Through the intervention period, the mean percentage of repeat tests decreased initially from a baseline mean of 20.9% to 13.2%, and then again to 8.5%. Similar to our primary outcome, the original change also occurred in temporal relation to the laboratory plan intervention (Fig 3).
The original observation of special cause variation in our primary and secondary outcomes coincides with special cause variation in our process measure after inclusion of a laboratory plan in daily notes. The percentage of notes including a plan originally increased from a median of 30% to 80%, but degradation of the process occurred in mid-May, and the median fell to 50%. After the implementation of the final intervention that included a reminder e-mail, this improved to our goal of 75% (Fig 4).
Exclusion of the top and bottom 5% of total billed charges resulted in 2926 and 2922 patient discharges for the 12 months pre- and postimplementation, respectively. Comparisons pre- and postintervention revealed a significant difference in total laboratory charges ($842, interquartile range [IQR] $256–$1863 vs $800, IQR $222–$1616, P = .002) but not total billed charges ($9822, IQR $6258–$17 265 vs $10 101, IQR $6578–$17 351, P = .12).
LOS (1.56 days, IQR 0.88–2.59 vs 1.59 days, IQR 0.88–2.60, P = .68) and 7-day readmission rates (3.4% vs 2.5%, P = .20) remained unchanged as balancing measures.
We used improvement science to implement multiple interventions that together achieved our primary goal of reducing unnecessary CBCs and BMPs while also reducing consecutive-day laboratory tests and total laboratory charges. The improvements in our outcomes had no adverse effects on LOS or readmissions. We focused on interventions to have an impact on 3 key drivers: effective laboratory communication among the primary team, knowledge of laboratory charges, and providers understanding the magnitude of the problem.
Previous successful attempts to reduce laboratory tests have relied mainly on single interventions. Many of these focused on the order entry process, either altering the ordering workflow16–21 or providing charge data in electronic ordering systems.23,24 Although feedback to teams regarding charges25 and educational initiatives30 have decreased costs, we believe our report is the first to use improvement science to combine multiple interventions that successfully improved and sustained physician ordering behavior without an emphasis on technology support.
While developing this project, our team had to define “unnecessary” laboratories to investigate failures and provide feedback. This definition helped us determine the vast majority of our failures were occurring within 1 day of hospital admission, allowing us to tailor our next intervention to the admission process. The definition also encouraged providers to be more focused in their requisition of follow-up tests, only ordering necessary components of a panel. This helps avoid spurious values not initially of clinical interest that subsequently drive further testing and resource utilization. Through our approach, we not only reduced laboratory tests that met our definition of unnecessary but also reduced the percentage of consecutive day CBCs and BMPs, which is currently a Choosing Wisely Campaign initiative for adult hospital medicine.31
Our team also had to define a goal. We chose a reduction to 5% to be both aggressive in our primary aim and to account for clinical scenarios in which a blanket definition may not capture clinical nuance. We felt that a goal of zero unnecessary tests would not have been feasible for multiple reasons. We understood that our definition of unnecessary would not fit every clinical scenario. For example, serial laboratory monitoring may be necessary in a child at risk for hemolysis or a child with metabolic disease at risk for severe electrolyte abnormalities even with minor illness. In these cases, initial laboratory tests may be normal, but repeat monitoring is necessary. In fact, a cluster of such patients contributed to the special cause variation seen in 3 of the 4 weeks between May 12, 2014, and June 2, 2014. Conversely, a profoundly abnormal BMP in a child with stable contraction alkalosis on a diuretic regimen may not warrant repeating despite having >3 abnormal values. For testing that falls outside the “unnecessary” definition but still provides little clinical utility, our secondary outcome was developed to measure broader repeat laboratory ordering behaviors.
Throughout our interventions, feedback from key stakeholders proved pivotal. Feedback allowed us to adapt our initial intervention to include only a future laboratory plan, which helped gain support from the interns, solidified our choice of placement for laboratory charges, focused the team laboratory discussions on admission, and helped attending physicians become more aware of the number of tests being ordered during their time on service. Even interventions that were short and aborted, like the direct attending feedback, were important in improving the teams’ understanding of the problems.
Initially, we saw significant improvement in primary and secondary outcomes after PDSA cycles targeted documenting laboratory plans. As the project progressed, we saw degradation in the process before returning to our goal, despite sustained improvements in outcomes. This may reveal that the laboratory plan process was sufficient but not necessary to drive change and that the cumulative effect of interventions began to change the culture of laboratory ordering.
This study was performed at a single site. Given the known heterogeneity of HM services, our interventions may not be generalizable to other hospitals or patient populations.
We were also unable to measure many of the potential benefits to the patients, their families, and the hospital as a whole. Ideally, refraining from laboratory testing reduces the number of isolated phlebotomy draws, increases sleep for patients and families, increases patient satisfaction, and improves laboratory turnaround time. Unfortunately, within our system, we did not have a way to develop obtainable metrics for these important suspected benefits. In addition, refraining from a BMP or CBC does not necessarily mean that no laboratory tests were obtained, because individual panel components or other tests might still be ordered.
Our analysis of charges simply compared charges pre- and postimplementation. We did not take into account the effects of inflation or any potential chargemaster changes. In addition, the data extracted from PHIS included all patients discharged from the HM service, regardless of the amount of time the HM service cared for that patient. Therefore, the charges or LOS for some patients may be more attributable to different services depending on time spent in the ICU or other factors. Total billed charges likely did not change given the fixed costs associated with hospital stays32 and the relatively small contribution of these panels to the total billed charge.
Almost all of the sustainability plans for this project were done manually, but the level of reliability could be greatly increased with information technology interventions. Alterations to the progress note to automatically include a section for laboratory plans, inclusion of laboratory charges in the computerized physician order entry system instead of note cards, and automated methods for feedback may make this less labor-intensive, and the technology could potentially be shared among institutions.
Finally, our definition of “unnecessary” is not a validated metric. Although some studies have set forth to determine the clinical necessity of laboratory testing,33 clinical nuance likely prohibits an overarching definition such as the one we developed. The metric also would not be appropriate for different patient populations. This represents an area for future study.
Through a quality improvement initiative with multiple interventions, we were able to successfully reduce and sustain the percentage of unnecessary BMPs and CBCs on our HM service and decrease total laboratory charges per patient while seeing no change in our balancing measures. We propose that we have improved the value of care provided for these patients by demonstrably reducing laboratory charges and, although not directly measured, potentially improving the patient experience. Further research is needed to determine if this approach is generalizable to other settings, patient populations, and laboratory or radiographic testing, especially in resource-intensive settings such as the ICU. Such efforts broadened to other areas of resource utilization may begin to bend the value curve by providing higher quality care at lower cost.
Surveys within this study were developed and administered using the REDCap database, which is supported by grant UL1 TR000445 from the National Center for Advancing Translational Sciences/National Institutes of Health. David Johnson received multiple thoughts and suggestions regarding this project from his classmates and faculty in the Intermediate Improvement Science Series (I2S2) offered through Cincinnati Children’s Hospital. Thanks to Dr Derek Williams for assisting with the data analysis for LOS and charges, Dr Jim Gay for helping extract the PHIS data, and Travis Harper for creating the data dashboard.
Dr Johnson conceptualized and designed the study and drafted the initial manuscript; Drs Lind, Parker, Beuschel, VanVliet, Nichols, and Rauch assisted with data collection and analysis and critically reviewed the manuscript; Ms Lee and Dr Muething assisted with conceptualizing the study and critically reviewed the manuscript; and all authors approved the final manuscript as submitted.
FINANCIAL DISCLOSURE: James Nichols participates on IL and Siemens Scientific Advisory Boards; receives speaker travel funds and honoraria from Abbott, Bio-Rad, and Roche; is on the Board of Directors for CLSI; and is the editor of Point of Care Testing journal.
FUNDING: No external funding.
POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
- Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001
- Kohn LT, Corrigan J, Donaldson MS
- Smith M, Saunders R, Stuckhardt L, McGinnis JM
- Hawkins RC
- Kwok J, Jones B
- Lin JC, Strauss RG, Kulhavy JC, et al
- Coon ER, Quinonez RA, Moyer VA, Schroeder AR
- May TA, Clancy M, Critchfield J, et al
- Procop GW, Yerian LM, Wyllie R, Harrison AM, Kottke-Marchant K
- Langley GJ
- Colantonio DA, Kyriakopoulou L, Chan MK, et al
- Kliegman R, Nelson WE
- Thakkar RN, Kim D, Knight AM, Riedel S, Vaidya D, Wright SM
- Society of Hospital Medicine Choosing Wisely Subcommittee. Society of Hospital Medicine—Adult Hospital Medicine: Five things physicians and patients should question. Choosing Wisely Campaign. 2013. http://www.choosingwisely.org/doctor-patient-lists/society-of-hospital-medicine-adult-hospital-medicine/. Accessed August 27, 2014
- Rehmani R, Amanullah S
- Copyright © 2016 by the American Academy of Pediatrics