Hospital Pediatrics
March 2015, Volume 5, Issue 3

Multicenter Development, Implementation, and Patient Safety Impacts of a Simulation-Based Module to Teach Handovers to Pediatric Residents

David P. Johnson, MD1; Kanecia Zimmerman, MD, MPH2,3; Betty Staples, MD2; Kathleen A. McGann, MD2; Karen Frush, MD2; and David A. Turner, MD2,3

1Department of Pediatrics, Division of Hospital Medicine, Monroe Carell Jr. Children’s Hospital at Vanderbilt, Nashville, Tennessee; 2Department of Pediatrics and 3Division of Pediatric Critical Care, Duke Children’s Hospital, Durham, North Carolina

Address correspondence to David P. Johnson, MD, 2200 Children’s Way, 8020 Vanderbilt Children’s Hospital, Nashville, TN 37232-9452. E-mail: david.p.johnson.1@vanderbilt.edu

Abstract

Objective: Teaching and evaluation of handovers are important requirements of graduate medical education (GME), but well-defined and effective methods have not been clearly established. Case-based computer simulations provide potential methods to teach, evaluate, and practice handovers.

Methods: Case-based computer simulation modules were developed. In these modules, trainees care for a virtual patient in a time-lapsed session, followed by real-time synthesis and handover of the clinical information to a partner who uses this information to continue caring for the same patient in a simulated night scenario, with an observer tallying included handover components. The process culminates with evaluator feedback and structured handover education. Interns were surveyed before and after module implementation to rate the quality of the handover they received and to report rapid response calls and transfers to the ICU.

Results: Fifty-two pediatric and medicine/pediatric residents from 2 institutions participated in the modules. “Anticipatory guidance” elements of the handover were the most frequently excluded (missing at least 1 component in 77% of module handovers). There were no significant differences in the proportion of nights with rapid response calls (7.24% vs 12.79%, P = .052) or transfers to the ICU (7.76% vs 11.27%, P = .21) before and after module implementation.

Conclusions: Case-based, computer-simulation modules are an easily implemented and generalizable mechanism for handover education and assessment. Although significant improvements in patient safety outcomes were not seen as a result of the educational module alone, novel techniques of this nature may supplement handover bundles that have been demonstrated to improve patient safety.

  • handovers
  • sign-outs
  • transition of care
  • computer simulation
  • resident education
  • patient safety

Handovers, or the transitions of responsibility between providers, are unique aspects of patient care that incorporate both communication and the continuum of care, 2 of the most-cited factors for sentinel events in the inpatient hospital setting.1 For trainees, recent changes in GME have decreased the maximum number of hours residents may work per week,2,3 with the inevitable consequence of more handovers per patient during a given hospitalization. These changes make it even more important that each person involved in the patient’s care have a complete understanding of the patient’s disease process and intended management course. Such understanding cannot occur without clear, succinct, and complete handovers.

Data demonstrate that resident handovers currently may lack pertinent clinical information and contribute to medical errors,4–11 including delayed diagnosis, delayed therapy, and death.5,8,9 A survey of internal medicine training programs found that standard handover processes and content were lacking both within individual training programs and across programs and hospitals.10 Along with this systematic inconsistency, trainees often omit important patient information in their handovers,4,11 including expected patient trajectory7 and clinical condition.12 Pediatric residents surveyed on inpatient services indicated that events often occurred for which they were not adequately prepared.6

Although there is general consensus that standardization is an important aspect of improving handovers,6,13–17 it is unclear what complementary methods will create a more robust handover training experience. Currently, no validated methods exist to teach or evaluate this process, and until a recent report by Starmer et al,18 there were no studies demonstrating any handover training interventions that improve patient safety.15 Historically, junior physicians learned handovers through observation and trial and error,13 with very few medical schools or GME programs specifically focusing on handovers before the most recent Accreditation Council for Graduate Medical Education duty-hour changes.10 Currently, programs are struggling to teach handovers effectively,16 and the Accreditation Council for Graduate Medical Education acknowledges the inherent risk of increased handovers related to duty-hour adjustments, requiring that handovers be monitored to ensure resident competency.3 As such, simulation has been suggested as a possible tool to develop handover skills14 and has been used as a training method.19,20

However, to our knowledge, simulating patient care events leading up to and after a patient handover has not yet been attempted. In an effort to improve handover education in 2 programs and monitor the potential impact on patient safety, we developed computer-based simulation modules that allow trainees to gather and synthesize relevant clinical information to provide concise, real-time peer-to-peer handovers with formal teaching and immediate evaluation in a safe learning environment. We sought to investigate whether the implementation of this educational program could produce a demonstrable improvement in patient outcomes through proxy patient safety data.

METHODS

Handover Process at Study Sites

The standard handover process at both sites was similar. Handovers at both institutions generally occurred separately as intern-to-intern and resident-to-resident, although there was some variability based on team preferences each month. Each site had electronic health record support that imported basic patient information, including name, room number, age, recent vital signs, and inpatient medications, into a printed handover document. These handover documents were then manually updated daily with pertinent clinical information, tasks to accomplish, and anticipatory guidance as appropriate. For verbal handover structure, Site 1 used the SIGNOUT21 mnemonic and Site 2 used the I-PASS22 mnemonic.
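
To make the structure of the printed handover document concrete, the sketch below models the fields described above as a simple record, separating those imported automatically from the electronic health record from those updated manually by the day team. This is a minimal illustration under stated assumptions, not the software used at either site; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HandoverEntry:
    """One patient's row in a printed handover document (illustrative only)."""
    # Fields auto-imported from the electronic health record
    name: str
    room_number: str
    age: str
    recent_vital_signs: str
    inpatient_medications: List[str]
    # Fields updated manually by the day team each day
    pertinent_clinical_information: str = ""
    tasks_to_accomplish: List[str] = field(default_factory=list)
    anticipatory_guidance: List[str] = field(default_factory=list)
```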

Module Development

After institutional review board approval, 2 computer-based simulation modules were developed by using Microsoft PowerPoint software (Microsoft, Seattle, WA). Each case consists of a 2-part scenario that takes trainees through a complete time-lapsed patient encounter. The first component of each module is a time-lapsed simulation of the first 12 hours of a patient admission, meant to replicate a “day shift” resident’s interaction with the patient, family, and team, with the progression of the slides “moving time forward” to reveal new patient information as the case evolves. This scenario includes a simulated history and physical examination, simulated team rounds that provide clinical advice regarding potential progression of the disease, laboratory and radiographic data, and multiple points at which the trainee must make clinical decisions based on his or her best clinical judgment. Trainees’ management decisions alter the patient’s course during the simulation, but scripted “responses” to each decision steer every scenario to the same clinical outcome, so that each trainee finishes with a standardized set of clinical information. The second component of the module consists of the continued management of the same patient over a time-lapsed “night shift,” during which the trainee’s clinical decisions alter the clinical course of the simulated patient.
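
One way to picture the “decisions alter the course but converge on the same outcome” design is as a small graph of time-lapsed steps in which each decision point offers several responses that all route to a shared next step. The sketch below is a hypothetical illustration of that structure, not the PowerPoint implementation the authors used; all step names and text are invented.

```python
# Hypothetical sketch of a time-lapsed scenario whose decision branches
# all converge on the same clinical outcome.

SCENARIO = {
    "admission": {
        "text": "History and physical examination are presented.",
        "next": "rounds",
    },
    "rounds": {
        "text": "Team rounds review possible disease progression.",
        "next": "decision_fluids",
    },
    "decision_fluids": {
        "text": "New laboratory data arrive. Choose a fluid strategy.",
        "choices": {
            "bolus": "Transient improvement in vital signs.",
            "maintenance_only": "Vital signs remain borderline.",
        },
        # Every choice converges here, so all trainees finish the day
        # shift with the same standardized clinical information.
        "next": "end_of_day_shift",
    },
    "end_of_day_shift": {
        "text": "The 12-hour day shift ends; prepare the handover.",
        "next": None,
    },
}

def run(scenario, start="admission", pick=lambda options: next(iter(options))):
    """Walk the time-lapsed scenario; `pick` stands in for the trainee's choice."""
    step = start
    while step is not None:
        node = scenario[step]
        print(node["text"])
        if "choices" in node:
            choice = pick(node["choices"])
            print(f"  -> chose {choice!r}: {node['choices'][choice]}")
        step = node["next"]

if __name__ == "__main__":
    run(SCENARIO)
```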

Each exercise is done in pairs. The first trainee (Trainee A) completes the initial component of the case while the second trainee (Trainee B) waits; Trainee A then provides a verbal and written handover of the simulated patient to Trainee B, just as she or he would during an actual patient handover. Trainee B then uses only the information obtained from the handover to complete a similar computerized, time-lapsed clinical scenario representing the continued overnight progression, making clinical decisions based on the handover. After completion of the first patient scenario, the trainees reverse roles to complete a second simulated case (Fig 1).

FIGURE 1

Schematic diagram of the entire educational module demonstrating the initial scenario participation and handover by Trainee A to Trainee B. Trainees then switch roles, and the module is followed by feedback and education.

Using expert review and a modified Delphi technique, 11 key points were identified as crucial elements for each module; these included elements of past medical history (2), problem list (4), pending tests (1), anticipatory guidance (2), and overnight tasks (2). During the face-to-face handover, a facilitator tallies the 11-item checklist in real time; a debriefing follows, during which the facilitator guides peer-to-peer feedback and provides structured education. The structured education involves the facilitator reading a 5-paragraph summary regarding the importance of handovers in patient care as well as providing specific feedback about the handovers observed during the modules. Datasheets were collected by using anonymous, unique identifiers for the participants. The entire session for both simulated cases can be completed in 30 to 45 minutes per pair, and multiple pairs can work at the same time.
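
A hedged sketch of how such a checklist tally could be recorded and scored is shown below. The category labels and element counts mirror the 11 elements described above, but the data structure, scoring function, and example values are illustrative assumptions rather than the study’s actual datasheet.

```python
# Illustrative tally of the 11-item handover checklist (categories and
# element counts follow the text above; the example handover is invented).

CHECKLIST = {
    "past_medical_history": 2,
    "problem_list": 4,
    "pending_tests": 1,
    "anticipatory_guidance": 2,
    "overnight_tasks": 2,
}  # totals 11 elements

def score_handover(included):
    """Return (elements included, percent included) for one observed handover.

    `included` maps each category to the number of its elements the trainee
    actually mentioned, e.g. {"problem_list": 3, "anticipatory_guidance": 1}.
    """
    total = sum(CHECKLIST.values())
    hit = sum(min(included.get(cat, 0), n) for cat, n in CHECKLIST.items())
    return hit, round(100 * hit / total)

# Example: a handover covering 8 of the 11 elements scores roughly 73%,
# comparable to the average of 8 of 11 elements reported in the Results.
example = {
    "past_medical_history": 1,
    "problem_list": 4,
    "pending_tests": 1,
    "anticipatory_guidance": 1,
    "overnight_tasks": 1,
}
print(score_handover(example))  # (8, 73)
```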

Handover Education Impact Analysis

In an effort to determine the module’s impact on patient safety events, first-year trainees at both institutions were surveyed (SurveyMonkey, Palo Alto, CA) on each of their call nights before and after the groups participated in the educational module. Therefore, each intern was surveyed multiple times. The survey was adapted from a previously published tool, with verbiage altered to reflect the systems in the 2 study centers (eg, specific health records) and questions added regarding rapid response calls and ICU transfers.6

Analysis

Data were examined and cleaned for logic and keying errors, with exclusion of 8 data points because of data entry errors. To address whether the intervention was associated with a change in patient outcomes, the change from preintervention to postintervention was analyzed for the key set of identified survey measures. Because respondents were not individually matched at the 2 time points, aggregate measures were analyzed with pre- or postintervention status as the classification variable; t tests or χ2 tests were used as appropriate. Testing was conducted separately by site and aggregated over both sites. Analyses were performed with SAS version 9.3 (SAS Institute, Inc, Cary, NC). Significance for the multiple outcome measures was maintained at a P value of .05.
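
As an illustration of the pre/post comparison of proportions, the sketch below runs a χ2 test on counts back-calculated from the reported percentages and response rates (16 of 221 preintervention nights and 28 of 219 postintervention nights with a rapid response call). These counts are approximations for illustration only, and the example uses Python’s scipy rather than the SAS used in the study.

```python
# Illustrative pre/post comparison of the proportion of call nights with a
# rapid response call, using counts back-calculated from the reported
# percentages (16/221 ≈ 7.24% pre, 28/219 ≈ 12.79% post). Approximate only.
from scipy.stats import chi2_contingency

pre_events, pre_nights = 16, 221
post_events, post_nights = 28, 219

table = [
    [pre_events, pre_nights - pre_events],     # [events, non-events] preintervention
    [post_events, post_nights - post_events],  # [events, non-events] postintervention
]

# Without Yates continuity correction this yields a P value of about .05,
# close to the P = .052 reported for the rapid response comparison.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, P = {p:.3f}")
```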

RESULTS

Module Implementation

Thirty pediatric and pediatric/internal medicine interns at Monroe Carell Jr. Children’s Hospital at Vanderbilt and 22 at Duke Children’s Hospital completed the modules. During the exercise, participants included an average of 8 (74%) of the 11 elements deemed important. The most commonly excluded components were the patient’s allergies (excluded in 75% of handovers) and the 2 anticipatory guidance components regarding fever (65%) and intravenous access (44%). At least 1 component of anticipatory guidance was excluded in 40 (77%) of the 52 module handovers.

Postcall Survey Analysis

The survey was sent 365 times preimplementation and 478 times postimplementation, with response rates of 221 (61%) of 365 preimplementation and 219 (46%) of 478 postimplementation. Call night characteristics were similar in the pre- and postmodule periods, with the exception of the perceived busyness of the call night and the number of patients for whom the interns were responsible. Interns at both institutions rated call nights as less busy on a 5-point Likert scale (1 very slow to 5 very busy) after implementation of the module (3.1 vs 2.9, P = .044) (Table 1). The percentages of nights with rapid response calls and ICU transfers did not differ.

TABLE 1

Postcall Survey Responses Before and After Intervention

On a 5-point Likert scale (1 inadequate to 5 excellent), handover quality was rated as a 3.6 before and 3.7 after the modules (P = .30). The percentage of nights with perceived unanticipated events did not statistically change (20.5% vs 18.8%, P = .66).

DISCUSSION

Handovers are vital to patient safety, and numerous educational initiatives have been attempted in this area. The handover modules developed in this multicenter study sought to address interpretive skills lacking in previous methods,13 while also incorporating components that have been deemed useful, including direct supervision, structured process,17 and face-to-face handovers.20 Despite the innovative handover training method described, we were not able to detect patient outcome benefits by using proxy patient safety data.

To date, Starmer et al18 have provided the only investigation demonstrating improved patient outcomes with a handover bundle intervention, but this required significant resources to record and document patient safety data. The centers in the current study were already using many of the components described in the Starmer et al18 investigation, with both centers importing components from an electronic health record into a handover document, using structured handover mnemonics, and attempting to complete handovers in quiet areas. However, team handovers were not the norm at either institution. The modules described here could be seen as complementary in many ways to the Starmer et al18 approach, with the 2-hour team communication training and monthly handover oversight replaced by modules that included direct observation, structured education, and an innovative emphasis on the use of trainees’ interpretive skills to guide the handover content (an approach not yet described in handover education). The importance of using interpretive skills to guide handovers in this manner is underscored in this investigation, as the elements of anticipatory guidance were frequently excluded, consistent with previous observations of actual handovers.7

As in most previous studies,15 the true patient safety impact of handover training remains elusive, although the total number of unexpected events did decline in the postimplementation period. Although this decline was not statistically significant, it is possible that a larger sample size or additional data would demonstrate that educational approaches, such as those undertaken in this project, improve both communication and patient-level outcomes. The lack of a significant decrease in the number of rapid responses called or transfers to a higher level of care also may result from variables other than handovers. The increased winter month acuity seen in many children’s hospitals during the postimplementation phase may have contributed to the possible trend toward higher rates of rapid response calls and ICU transfers after implementation. In addition, it is conceivable that better handovers led to more open lines of communication, earlier rapid response calls, and more “safe” transfers to the ICU. Another possible contributor was survey fatigue, with surveys more likely to be completed on nights when an event occurred during the postimplementation period.

Along with its strengths and novel characteristics, this study also has limitations. First, the handover environment during these scenarios lacks many of the distractors present during actual handovers. The study also involved only a single patient per scenario, in contrast to the transfer of information for multiple patients that is common in the actual clinical environment. These modules were designed to address the format and process of transferring the care of a patient, not the additional complexities inherent in handovers that involve multiple patients. In addition, virtual scenarios are not identical to caring for actual patients. However, as in the US Medical Licensing Examination Step 3, scenarios can take trainees through the thought process of patient care. Finally, obtaining survey data over long periods to assess the impact of an educational method may not be ideal. Response rates fell after implementation, which is perhaps not surprising given that the interns received the survey after each night shift for almost 7 months. In addition, handovers are a skill generally learned through experience, making it difficult to determine whether any changes in handover quality or patient safety outcomes are a result of an intervention or of natural skill development. Further investigations into handover training will have to take these factors into consideration.

CONCLUSIONS

Patient safety events are often related to deficient patient handovers, but because of their multifactorial nature, determining the impact of an educational handover initiative on patient safety remains difficult. Nevertheless, the case-based computer simulation in this investigation was a well-received method to teach handovers in 2 separate pediatric residency programs. Generalizable modules such as these provide a potential avenue to supplement a handover bundle and to objectively monitor trainees’ progress through the Pediatric Milestones.23 Although we were not able to detect an improvement in the events selected as potential patient safety proxies, modules such as those presented here may effectively augment handover bundles and incorporate clinical decision-making.

Acknowledgments

Special thanks to members of the Duke Office of Clinical Research, especially Richard Sloane, for their help with data analysis. We would also like to thank the pediatric and medicine/pediatric residents at both Monroe Carell Jr. Children’s Hospital at Vanderbilt and Duke Children’s Hospital for taking the time to participate in the modules and complete the surveys, the Pediatric Residency leadership at both institutions for supporting the study, and all those who helped facilitate the modules, including the pediatric chief residents at both institutions, the leadership of both programs, and many of the faculty of the Division of Pediatric Hospital Medicine at Vanderbilt.

Footnotes

  • FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.

  • FUNDING: No external funding.

  • POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

ABBREVIATIONS

GME: graduate medical education
IT: information technology
RRTs: rapid response teams

REFERENCES
