Hospital Pediatrics
August 2016, Volume 6, Issue 8

Developing the Capacity for Rapid-Cycle Improvement at a Large Freestanding Children’s Hospital

Evan S. Fieldston, MD, MBA, MSHPa,b; Jennifer A. Jonas, BSE, BAa; Virginia A. Lederman, MBAc; Ashley J. Zahm, MHAd; Rui Xiao, PhDe; Christina M. DiMichele, MSNf; Ellen Tracy, MSNf; Katherine Kurbjun, MSNf; Rebecca Tenney-Soeiro, MD, MSEda,b; Debra L. Geiger, MHSc; Annique Hogan, MDa,b; and Michael Apkon, MD, MBA, PhDg

aDivision of General Pediatrics, cOffice of Safety and Medical Operations, and Departments of dAnesthesiology and Critical Care Medicine and fNursing, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania; Departments of bPediatrics and eBiostatistics and Epidemiology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; and gHospital for Sick Children, Toronto, Canada
Address correspondence to Evan Fieldston, MD, MBA, MSHP, Children’s Hospital of Philadelphia, 34th St and Civic Center Blvd, Philadelphia, PA 19104. E-mail: fieldston@email.chop.edu
Dr Fieldston helped conceptualize and design the study and critically reviewed the manuscript; Ms Jonas drafted and revised the manuscript and contributed to data analysis; Ms Lederman, Ms Zahm, Ms DiMichele, Ms Tracy, Ms Kurbjun, Dr Tenney-Soeiro, Ms Geiger, Dr Hogan, and Dr Apkon helped conceptualize and design the study and critically reviewed the manuscript; Dr Xiao contributed to data analysis and critically reviewed the manuscript; and all authors approved the final manuscript as submitted.

Abstract

BACKGROUND: To develop the capacity for rapid-cycle improvement at the unit level, a large freestanding children’s hospital designated 2 inpatient units with normal patient loads and workforce as pilot “Innovation Units” where frontline staff was trained to lead rigorous improvement portfolios.

METHODS: Frontline staff received improvement training, and interdisciplinary teams brainstormed ideas for tests of change. Ideas were prioritized using an impact-effort evaluation and an assessment of how they aligned with high-level goals. A template for each test summarized the following: the opportunity for improvement, the test being conducted, dates for the tests, driver diagrams, metrics to measure effects, baseline data, results, findings, and next steps. Successful interventions were implemented and disseminated to other units.

RESULTS: Multidisciplinary staff generated >150 improvement ideas, and the Innovation Units collectively ran >40 plan-do-study-act cycles. Of the 10 distinct improvement projects, elements of all 10 were deemed “successful” and fully implemented on the unit, and elements from 8 were spread to other units. More than 3 years later, elements of all of the successful improvements are still in practice in some form on the units, and each unit has tested >20 additional improvement ideas, using multiple plan-do-study-act cycles to refine them.

CONCLUSIONS: The Innovation Unit model successfully engaged frontline staff in improvement work and established a sustainable system and framework for managing rigorous improvement portfolios at the unit level. Other hospitals and health care delivery settings may find our quality improvement approach helpful, especially because it is rooted in the microsystem of care delivery.

Hospital leaders are responsible for driving performance across multiple quality domains, including safety, clinical effectiveness, patient experience, and financial stewardship.1,2 However, although executives can develop strategies, it is unit managers and frontline staff who must change processes effectively to drive performance while simultaneously managing a busy clinical operation. Thus, leadership at our large freestanding children’s hospital sought to research, design, and pilot a quality improvement model that would leverage elements of existing models but focus specifically on embedding rapid-cycle improvement capabilities in regular inpatient units with a normal workforce and patient load.

In fiscal year 2013, 2 fully functioning inpatient units were selected to serve as pilot “Innovation Units” (Table 1). Managers and frontline staff on these units were trained to lead local improvement portfolios to affect outcomes across a series of strategic domains. Leading an improvement portfolio involved encouraging staff to propose ideas for ways to achieve high-level outcomes, planning and running plan-do-study-act (PDSA) cycles to test ideas, establishing appropriate metrics and systematically tracking progress over time, and spreading successful improvements throughout the unit and to other units.

TABLE 1

Descriptive Information for the 2 Innovation Units

One of the premises of the initiative was that “innovative” ideas did not have to be completely revolutionary but could simply be a modification of an existing process or object.3 Another premise was that multiple PDSA cycles could occur simultaneously. Because unit-based improvement activities had previously focused on 1 or a limited number of initiatives at a time, the idea of managing a portfolio of projects with differing degrees of complexity and maturity dates was new. The intent of this staggered approach was to allow units to test many more innovations and to ensure that improvement work was happening continuously.

The long-term, high-level goals of the Innovation Unit initiative were to improve outcomes associated with strategic objectives around patient safety, quality improvement, patient/family satisfaction, and cost reduction. However, the goal of the pilot was to assess whether the model could successfully accomplish the following: (1) engage unit-based leadership and multidisciplinary frontline staff in quality improvement work, (2) encourage the proposal of new ideas, (3) educate unit-based staff in the clinical setting, (4) incorporate a sustainable system and framework to assist unit-based leadership in managing a rigorous portfolio of work, and (5) stimulate a culture of continuous quality improvement at the unit level that would mirror the organization’s long-standing spirit of innovation in biomedical science and clinical care.

Methods

Design and implementation of the Innovation Unit initiative involved 7 focus areas: (1) design of the project governance structure, (2) kickoff and training, (3) engagement and idea generation, (4) idea prioritization and buy-in, (5) implementation and communication, (6) portfolio framework and monitoring, and (7) sustainability and spread.

Project Governance Structure

The hospital’s then Chief Medical Officer (M.A.) and Chief Nursing Officer served as executive sponsors and provided high-level guidance and direction to the project team. A general pediatrics hospitalist, who served as Medical Director of Care Model Innovation (E.F.), and a Nursing Director (E.T.) cosponsored the initiative. In addition to providing regular guidance, they oversaw all aspects of implementation. Improvement Advisors (1.75 full-time equivalents) from the Office of Safety and Medical Operations were also assigned to the project. The core project team consisted of nurse managers, medical directors, nursing leaders, and a chief resident. A steering committee consisting of managers and directors from the departments of pediatrics, pharmacy, supply chain, and other key areas was also assembled to assist with leveraging technology, connecting with subject matter experts, and aligning the tests of change with other hospital initiatives.

Kickoff and Training

Kickoff meetings were held over a 6-week period to introduce multidisciplinary staff to the initiative. After the kickoff, all unit-based staff, ranging from skilled nursing assistants to medical directors, were required to attend educational sessions run by local improvement advisors that provided a general “Introduction to Improvement” as well as a more specific introduction to the hospital’s improvement framework. Training beyond these basic modules was tailored to the person’s role in the project. Members of the core team were required to complete modules on “Running PDSA Cycles,” “Creating Driver Diagrams,” and “Managing Costs of Care.” They were also asked to complete quality improvement and patient safety modules through the Institute for Healthcare Improvement (IHI) Open School.4 Staff members involved in specific tests of change were asked to complete concise, modular, just-in-time training as appropriate. Unit leaders monitored the completion of all required modules for staff on their unit.

Engagement and Idea Generation

Physicians, nurses, advanced practice nurses, social workers, case managers, and other staff attended brainstorming sessions to generate lists of improvement opportunities linked to the high-level goals. Unit-specific e-mail addresses were also established for staff to submit ideas. Using the framework of business model innovation, project sponsors emphasized that small changes could lead to significant improvements in patient outcomes; therefore, ideas did not need to be completely revolutionary but could simply be enhancements or tweaks to current processes or common objects. For example, they noted that Netflix did not invent the DVD but simply changed the way consumers obtained DVDs. Similarly, improvements in patient care could result from “Netflix ideas” and did not require “inventing the DVD.”

Idea Prioritization and Buy-in

Once the list of potential projects was generated, ideas were organized into categories and subcategories based on the quality domain that they addressed. For example, a “hospital-acquired conditions” category was subdivided into groups for central line–associated bloodstream infections, nosocomial viral infections, and peripheral intravenous infiltrates. To select which ideas to test, the core team created a prioritization matrix that included an “impact-effort” evaluation and an assessment of how well each idea aligned with high-level goals (Fig 1).

FIGURE 1

The project team used this prioritization matrix to select which quality improvement ideas to test on the Innovation Units. The matrix included an “impact-effort” evaluation and an assessment of how well each idea aligned with high-level goals.
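For readers who want to operationalize this kind of prioritization, the sketch below shows one way an impact-effort-alignment score could be tabulated. It is a minimal illustration only; the field names, weights, and example ratings are hypothetical and are not the matrix the core team actually used.

```python
from dataclasses import dataclass

@dataclass
class ImprovementIdea:
    """One brainstormed idea with illustrative 1-5 ratings from the core team."""
    name: str
    impact: int      # expected effect on high-level outcomes (5 = large)
    effort: int      # effort required to test and implement (5 = substantial)
    alignment: int   # fit with strategic goals (5 = strong)

    def priority_score(self) -> int:
        # Hypothetical weighting: favor impact and alignment, penalize effort.
        return 2 * self.impact + self.alignment - self.effort

# Example ideas drawn from projects described in this article; ratings are made up.
ideas = [
    ImprovementIdea("Charge nurse safety report", impact=4, effort=2, alignment=5),
    ImprovementIdea("Rounding checklist", impact=3, effort=3, alignment=4),
    ImprovementIdea("Stethoscope hygiene stations", impact=2, effort=4, alignment=3),
]

# Rank so that high-impact, low-effort, well-aligned ideas surface first.
for idea in sorted(ideas, key=ImprovementIdea.priority_score, reverse=True):
    print(f"{idea.priority_score():>3}  {idea.name}")
```

In practice the team reviewed ideas on an impact-effort grid rather than computing a single number, but a simple score like this can help sort a long brainstormed list before that discussion.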

Implementation and Communication

Once projects were selected, unit leaders presented plans for the tests of change to frontline staff at regularly scheduled multidisciplinary team huddles, and the chief resident presented plans to resident teams. Each stakeholder group discussed steps necessary to implement the changes, and volunteers were chosen to champion and monitor tests on a daily basis.

Multiple methods of communication were used throughout the process. The core team had weekly check-ins with project sponsors and improvement advisors. To remind frontline staff members about their role in implementing the tests, the core team used multiple forms of communication, including e-mail, in-person discussions, paper reminders, and instructional videos. Physicians updated their specialty divisions, and the chief resident communicated updates to residents. To keep other stakeholders informed, the steering committee attended quarterly meetings and received updates about the status of the projects. A series of 5-minute videos provided updates on the initiative and on specific interventions. Toward the end of the pilot, an internal Web site was launched with links to education modules and videos and details about the tests of change.

Portfolio Framework and Monitoring

To help units manage each project, the core team designed a template that outlined the portfolio framework. For each PDSA cycle, this template summarized the following: the opportunity for improvement, the test being conducted, the dates and locations of the test, a driver diagram linking the test to high-level goals, the metrics being used to measure the effects of the intervention, baseline data, test results, key findings, and next steps (Fig 2). The template also specified which staff members were responsible for specific tasks. In some instances, students helped observe and track behavior changes.

FIGURE 2

Snapshot of a template from the Innovation Units’ quality improvement portfolio. These templates were used to help manage and track the status of each test of change.
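As an illustration of how such a template might be captured electronically, the sketch below renders the fields listed above as a structured record. The class, field names, and example values are hypothetical; they are not the hospital’s actual template or data.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List

@dataclass
class PDSATestRecord:
    """Hypothetical structured form of the portfolio template fields described above."""
    opportunity: str                     # the opportunity for improvement
    test_description: str                # the test of change being conducted
    start_date: date
    end_date: date
    locations: List[str]                 # unit(s) where the test is running
    driver_diagram: str                  # reference linking the test to high-level goals
    metrics: List[str]                   # measures used to evaluate the intervention
    baseline: Dict[str, float]           # baseline value for each metric
    results: Dict[str, float] = field(default_factory=dict)
    key_findings: str = ""
    next_steps: str = ""
    responsible_staff: Dict[str, str] = field(default_factory=dict)  # task -> owner

# Example entry for one medication reconciliation PDSA cycle (values illustrative).
example = PDSATestRecord(
    opportunity="Medication reconciliation within 24 h of admission below target",
    test_description="Weekly e-mail reminders to physicians",
    start_date=date(2013, 1, 7),
    end_date=date(2013, 2, 4),
    locations=["Innovation Unit A"],
    driver_diagram="driver_diagrams/medication_reconciliation.pdf",
    metrics=["% of admissions reconciled within 24 h"],
    baseline={"% of admissions reconciled within 24 h": 52.0},
    responsible_staff={"daily audit": "charge nurse", "resident reminders": "chief resident"},
)
print(example.opportunity, "-", example.test_description)
```

Keeping each cycle in a consistent record like this is what allows a unit to run several tests in parallel and still see, at a glance, which are in planning, which are collecting data, and which are ready to spread.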

Sustainability and Spread

Once a test was completed, unit-based teams evaluated the intervention using quantitative data and qualitative feedback from staff. If the intervention was deemed successful, it was fully implemented on that unit. To spread the idea to other units, unit leaders presented the results of their successful tests in various forums including regularly scheduled daily calls with nurse managers from every unit, nursing council meetings, and the steering group meetings. Changes were also incorporated into training modules and onboarding materials.

Improvement Example

One of the project areas that both units selected was increasing clinician compliance with medication reconciliation within 24 hours of admission. Some of the PDSA cycles involved increasing data transparency, reeducating residents about the process, adding soft alerts such as verbal reminders from senior residents during rounds, and sending weekly e-mail reminders to physicians. Building on lessons learned from these tests, the final PDSA iteration involved working with informatics resources to develop and implement an inline alert in the computerized order-entry system.

Results

Engagement and Idea Generation

More than 170 staff members and 70 physicians attended the kickoff meetings. Within 2 months, multidisciplinary staff generated >150 ideas during the brainstorming sessions and through e-mail submissions. The project team also received requests from staff across the organization to use the Innovation Units as a testing ground for their ideas.

Education

In addition to the introductory education sessions, 48 staff members from the 2 units completed quality improvement training modules through the IHI Open School (Table 2). A core group of 8 staff members completed all 6 quality improvement modules, which included 23 individual lessons and accounted for >8 hours of educational time. Other staff completed select modules that were relevant to their role. Collectively, staff on the units completed 498 quality improvement lessons, which accounted for >160 hours of quality improvement training.

TABLE 2

IHI Open School Educational Modules

Improvement Capacity at the Unit Level

Each unit selected 6 or 7 improvement projects for the pilot year. Because there was some overlap between the projects on the 2 units, altogether there were 10 distinct projects. Each project included multiple PDSA cycles, so collectively the units ran >40 tests of change during the year. A snapshot of 1 portfolio a few months into the pilot showed that the unit was engaged in designing a rounding checklist, fully implementing a safety and care coordination team huddle, and planning the dissemination of a charge nurse safety report to other units.

Sustainability and Spread

Of the 10 distinct improvement project areas from the pilot, some elements of all 10 were ultimately deemed successful and fully implemented on the originating unit. However, not all elements of each project “bundle” were successful. A stethoscope hygiene element of the project aimed at reducing nosocomial viral infections, for example, was abandoned after consensus that it did not provide a net benefit. Of the 10 distinct project areas, elements of 8 were spread to other units. More than 3 years later, all 10 of the projects that were fully adopted on the units during the pilot are still in practice in some form, although many have since been further refined.

In addition to the sustainability of the individual improvements, the Innovation Units have sustained their capacity for rapid-cycle improvement. In the 3 years since the pilot, the units have each tested >20 improvement ideas using iterative PDSA cycles to refine each one.

Improvement Example

In the 3 months before the pilot, the monthly rate of medication reconciliation within 24 hours of admission on the Innovation Units was 52%, 58%, and 45%, while hospital-wide compliance was between 58% and 59%. Two months postintervention, rates on the Innovation Units increased to 73% and remained high for the remainder of the pilot, with some months exceeding 90%. Although hospital-wide rates also increased during this time to ∼69%, the Innovation Units consistently outperformed the hospital-wide average, with differences ranging from 4 to 26 percentage points. In the months that followed implementation of the inline alert, compliance rates continued to increase, and after 6 months, compliance remained steadily >95%. When these improvements were spread throughout the hospital, hospital-wide compliance increased to >90% and has remained steady between 94% and 97% (Fig 3).

FIGURE 3

Results from the Innovation Unit project on medication reconciliation.

Discussion

Continually fostering new ways of thinking at the front line and training frontline staff in a rapid, relevant way are essential to achieving sustainable changes that benefit patients.5-9 Thus, we sought to design and pilot a quality improvement model that would engage frontline staff and ultimately lead to measurable improvements in patient- and hospital-level outcomes.

The approach described in this article differs from other approaches in a few ways. First, it trains large numbers of frontline staff in the clinical setting without dedicating significant resources in the form of time or money. Although there are benefits to sending staff away for extensive training, it is not practical for many organizations. Our approach is also highly efficient in that it uses targeted, just-in-time training and reinforcement.

Second, other improvement-via-innovation approaches have elements that operate outside of “clinical microsystems,” which are subcultures of clinical and nonclinical staff with their own structures and processes that provide care to specific patient cohorts.10 For example, some organizations use designated “Innovation Designers” or “Chief Innovation Officers” to identify improvement opportunities, design and test changes, and disseminate them to patient care areas or to lead teams of innovators across the organization, whereas other models use “Centers for Innovation” or “Learning Laboratories” to study improvement ideas.11-14 However, because clinical microsystems are well suited to engage frontline staff in unit-level improvement activities,15 we adapted elements of existing models to focus on engaging and educating staff within the clinical microsystem and building the capacity for rapid-cycle improvement at the unit level.

During the pilot year, the Innovation Unit initiative successfully engaged multidisciplinary frontline staff in improvement work and established a system and framework to help units manage a rigorous improvement portfolio. Multidisciplinary staff generated >150 improvement ideas, and in addition to attending educational sessions led by local improvement advisors, 48 staff members completed modules through the IHI Open School. With this base of engagement and training, each unit successfully managed a portfolio of 6 to 7 improvement projects, which included >20 PDSA cycles, and successfully spread almost all of the improvements to other units. More than 3 years later, all improvements that were deemed successful are still in practice, and in each year since the pilot, the units have continued to manage a portfolio of ∼6 new projects, each with multiple PDSA iterations.

In addition to incorporating improvement skills into the core competencies of unit-based staff, the initiative raised awareness among staff members about their responsibility to contribute to continuous quality improvement. Early indications of positive trends on key metrics also set an example for other units, and since the pilot, Innovation Unit teams have played an integral role in advising other units about quality improvement.

Factors That Contributed to the Initiative’s Success

One factor critical to the success of the initiative was the high level of senior leadership engagement and support from its inception.16 The Chief Medical Officer and Chief Nursing Officer rounded on the units, and the project was publicized and acknowledged in multiple venues attended by operational and clinical leadership.

Establishing a strong leadership structure on the units was also critical.16 Each unit had a strong nurse manager and medical director championing the work on a daily basis. Unit leaders also had dedicated time for quality improvement, which allowed them to focus on designing, communicating, implementing, and tracking tests without having to balance improvement work with patient care. Another key element was that the portfolio templates clearly delineated who was responsible for each aspect of managing the tests. In addition to ensuring accountability, this ensured that everyone was aware of his or her specific role.

Engaging multidisciplinary staff, including nurses, attending physicians, and residents, from the beginning was also important.16 Because all stakeholders were included in the kickoff and encouraged to participate in the brainstorming process, there was a high level of awareness of and support for the initiative. Providing relevant, just-in-time training, keeping staff informed about the progress of the projects, and continually reminding staff members about what they needed to do to implement the changes were also key.

Another factor that contributed to maintaining engagement was providing opportunities to highlight small successes.16 Staff members who were integral to the design, implementation, or tracking of successful tests were asked to present their project in venues ranging from local staff meetings to hospital-wide forums. These presentations helped raise awareness about successful improvements while providing an opportunity to acknowledge teams for their efforts.

Challenges and Limitations

Some of the challenges of designing improvement projects involved obtaining baseline measures and identifying appropriate target outcomes. For interventions aimed at improving care coordination, for example, it was difficult to quantitatively assess the current state and track changes. In these cases, qualitative feedback was used. Tracking compliance with certain interventions, such as hand-hygiene protocols, was also difficult.

Another challenge involved demonstrating associations between unit improvements and high-level outcomes. Because multiple tests were running simultaneously and because so many factors contribute to metrics around patient safety, quality, satisfaction, and costs, it was difficult to assess the direct effects of the interventions. This was compounded by the fact that clear data for these outcome metrics are difficult to obtain. However, for the pilot, demonstrating causality was less important than building the capacity for rapid-cycle improvement.

Measuring whether we were successful in initiating a culture change was also difficult because we did not have baseline data on staff perceptions of the culture on the units. However, qualitative feedback was positive, and since the pilot, the Office of Safety and Medical Operations has begun administering a survey to gauge perceptions of culture and capability, which units now use to target training and track progress.

Ensuring the long-term sustainability of the improvement model also remains an ongoing challenge. Although we have tried to embed improvement work in the daily functioning of the units and to distribute responsibilities so that continuity does not rely on a few individuals, we still experience setbacks when key staff members leave. However, incorporating quality improvement responsibilities into job descriptions and maintaining protected time has helped to mitigate this.

Conclusions

The Innovation Unit model succeeded in engaging frontline staff in quality improvement work and in establishing a framework for managing rigorous improvement portfolios at the unit level. Although the process was initiated using external resources in the form of dedicated time from improvement advisors, since the pilot, the units have sustained the model independently. Other hospitals and health care delivery settings may find our approach to improvement helpful, particularly because it is rooted in the microsystem of care delivery.

Acknowledgments

We thank the physicians and nurses who led this work on the units, including Jennifer Sullivan, Elena Becker, Regina Edge, Mary Ann Gibbons, Sarah Murawski, Kim Smith-Whitley, and Erika Leep.

Footnotes

  • FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.

  • FUNDING: Internal funds from the Children’s Hospital of Philadelphia

  • POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
