    Hospital Pediatrics
    February 2015, VOLUME 5 / ISSUE 2, pp 96–100; DOI: 10.1542/hpeds.2014-0091
    From the American Academy of Pediatrics
    Brief Reports

    Evaluating Simulation Education Via Electronic Surveys Immediately Following Live Critical Events: A Pilot Study

    Corinne Savides Happel, Meredith A. Lease, Akira Nishisaki, Matthew S. Braga

    Abstract

    Background and Objectives: Simulation-based medical education has become popular in postgraduate training for medical emergencies; however, the direct impact on learners’ clinical performances during live critical events is unknown. Our goal was to evaluate the perceived impact of simulation-based education on pediatric emergencies by auditing pediatric residents immediately after involvement in actual emergency clinical events.

    Methods: Weekly team-based pediatric simulation training for inpatient emergencies was implemented in an academic tertiary care hospital. Immediately after actual pediatric emergency events, each resident involved was audited regarding roles, tasks performed, and the perceived effectiveness of earlier simulation-based education. The audit used a Likert scale.

    Results: From September 2010 through August 2011, a total of 49 simulation sessions were held. During the same period, 27 pediatric emergency events occurred: 3 code events, 14 rapid response team activations, and 10 emergency transfers to the PICU. Forty-seven survey responses from 20 pediatric residents were obtained after the emergency clinical events. Fifty-three percent of residents felt well prepared, and 45% reported having experienced a similar simulation before the clinical event. A preceding similar simulation experience was perceived as helpful in improving clinical performance. Residents’ confidence levels, however, did not differ significantly between those who reported having had a preceding similar simulation and those who had not (median of 4 vs median of 3; P = .16, Wilcoxon rank-sum test).

    Conclusions: A novel electronic survey was successfully piloted to measure residents’ perceptions of simulation education compared with live critical events. Residents perceived that their experiences in earlier similar simulations positively affected their performances during emergencies.

    • graduate medical education
    • code blue
    • patient simulation
    • pediatrics

    Pediatric in-hospital emergencies are not rare events1; nevertheless, they are not sufficiently frequent for resident trainees to gain and maintain skill competence. Simulation-based education enhances resident learning in critical skills.2 For this reason, pediatric residency programs increasingly use simulation-based education to prepare residents for critical events.3 Simulation training for pediatric trainees has proven effective in improving technical and teamwork skills in a training environment.4–8 Few studies have attempted to evaluate the clinical impact of simulation-based education. One study correlated improved pediatric cardiopulmonary arrest survival rates with the introduction of simulation training in a hospital.9 However, the perceived impact of simulation-based education on learners’ actual clinical performances during emergencies has not been measured.

    To address this question, we designed a pilot study to evaluate the perceived clinical impact of simulation-based education. We specifically attempted to link learners’ perceptions of training with their clinical experiences during pediatric emergencies.

    METHODS

    With local institutional review board approval, this pilot study was conducted from September 2010 through August 2011 at the Children’s Hospital at Dartmouth (CHaD), a pediatric tertiary teaching hospital in Lebanon, New Hampshire. Consent was implied by each learner’s voluntary participation. CHaD is part of Dartmouth-Hitchcock Medical Center, a large academic center with 396 total beds; pediatric residents are responsible for the 20-bed pediatric inpatient unit, the 10-bed PICU, and the 30-bed neonatology unit. The hospital has a pediatric residency program accredited by the Accreditation Council for Graduate Medical Education; the program has 7 residents per year.

    At CHaD, simulation-based education has been integrated into pediatric residency training over the last 5 years. A mock code program was designed to call for repeated resident participation, minimizing the decline over time of knowledge and skills gained from simulation-based education.10 The program was overseen by the senior author (M.S.B.), a critical care physician with 6 years of clinical and simulation training and debriefing experience. In this program, pediatric residents were scheduled to participate in weekly mock codes during general pediatric inpatient unit and PICU rotations throughout their 3 years of training; on average, residents spent 4 months per year in these rotations. Mock codes were held in a high-fidelity simulation laboratory and occurred weekly at the same scheduled time. Each mock code consisted of a clinical scenario lasting ∼30 minutes. Pediatric residents and rotating anesthesiology residents participated. Sessions were videotaped and replayed for participants at the debriefer’s discretion.

    The training scenarios were chosen by pediatric critical care faculty based on the team members’ earlier experiences with simulation as well as any recently identified gaps in clinical exposure. A total of 49 different scenarios were used, including respiratory arrest, asystole, ventricular fibrillation, supraventricular tachycardia, septic shock, status epilepticus and asthmaticus, increased intracranial pressure, and toxin exposure. Debriefing (led by an experienced pediatric critical care clinician as a content expert and an experienced debriefer as a simulation education expert) was performed immediately after each scenario and lasted 15 to 30 minutes. The content of the debriefing covered both medical and crisis resource management topics; the methods of debriefing followed either an advocacy-inquiry or a plus/delta format.11,12 Residents also participated in separate simulation sessions focused on technical skills, including tracheal intubation, central line placement, intravenous cannulation, and intraosseous needle placement.

    At CHaD, the pediatric code team consists of 2 pediatric residents on the general ward service, 1 pediatric resident on the PICU service, an on-call anesthesiologist, an on-call respiratory therapist, and an on-call PICU nurse. The rapid response team comprises the same personnel minus the anesthesiologist. The on-call PICU attending and relevant consulting services are notified of critical events and respond as appropriate; there are no overnight in-house PICU attending physicians, and the program has no PICU fellows.

    The research team implemented a notification system in conjunction with the hospital-wide life support program. The research team was notified via an automated e-mail system within 24 hours after code blues, rapid response team activations, and urgent transfers from the regular pediatric unit to the PICU. Once notified, the research team reviewed code documentation and medical records and sent an electronic 12-question survey to all residents who participated in the event (Supplemental Information). The survey took ∼10 minutes to complete. Respondents were asked to describe the nature of the clinical event, their roles in the event, and their levels of perceived preparedness. Respondents were also asked how previous simulation compared with the event and whether they felt the simulation had improved their performance. Two free-form questions allowed residents to describe any aspect of the critical event that differed from what they anticipated and to list any knowledge or skill they wished they had had before the clinical event. This questionnaire was developed in a manner similar to that used by Hunt et al.1 It was reviewed and refined by 5 graduating residents; the pediatric simulation group at CHaD then further revised the survey for content validity.
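
    To make the audit pipeline concrete, the following is a minimal sketch of the notification-and-survey workflow as described above. It is an illustration only: the event labels, data fields, and send_survey helper are hypothetical assumptions, not the authors’ actual system.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical labels for the 3 event types that trigger an audit;
    # these names are illustrative, not the hospital's own codes.
    EVENT_TYPES = {"code_blue", "rapid_response", "urgent_picu_transfer"}

    @dataclass
    class CriticalEvent:
        event_type: str             # one of EVENT_TYPES
        occurred_at: datetime       # notification arrives within 24 hours
        resident_emails: list[str]  # residents who participated in the event

    def send_survey(email: str, event: CriticalEvent) -> None:
        """Placeholder for dispatching the 12-question electronic survey."""
        print(f"Survey sent to {email} after {event.event_type}")

    def handle_notification(event: CriticalEvent) -> None:
        # After the automated e-mail arrives and the research team has
        # reviewed code documentation and medical records, every resident
        # who participated in the event is surveyed individually.
        if event.event_type not in EVENT_TYPES:
            return  # not one of the audited event types
        for email in event.resident_emails:
            send_survey(email, event)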

    No sample size calculation was performed because this research was planned as a 1-year pilot study. Our unit of analysis was the response, not the resident, because training status and skill sets change over time. Therefore, residents were asked to respond after each distinct critical event in which they were involved.

    Items on the questionnaire regarding perception used a 5-point Likert scale (1, not at all; 2, little/few; 3, some; 4, well; and 5, very well). These data are summarized with medians and interquartile ranges (IQRs) because the sample size was small and the data were not normally distributed. To assess the impact of previous simulation training on residents’ perceptions of preparedness, the Wilcoxon rank-sum test was used because of the nonparametric nature of the responses.
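
    As a concrete illustration, the following is a minimal sketch of this analysis in Python, assuming responses are coded 1 to 5; the group labels and the ratings shown are hypothetical placeholders, not the study data.

    import numpy as np
    from scipy.stats import ranksums

    # Hypothetical preparedness ratings on the 5-point Likert scale
    # (1 = not at all ... 5 = very well), split by whether the
    # respondent reported a similar earlier simulation.
    prior_sim = np.array([4, 4, 3, 5, 4, 3, 4])     # placeholder values
    no_prior_sim = np.array([3, 3, 4, 2, 3, 4, 3])  # placeholder values

    def summarize(ratings):
        """Median and IQR, the summary statistics reported in this study."""
        q1, med, q3 = np.percentile(ratings, [25, 50, 75])
        return med, q1, q3

    for label, group in [("similar simulation", prior_sim),
                         ("no similar simulation", no_prior_sim)]:
        med, q1, q3 = summarize(group)
        print(f"{label}: median {med:g} (IQR: {q1:g}-{q3:g})")

    # Nonparametric comparison of the 2 groups' ratings.
    stat, p = ranksums(prior_sim, no_prior_sim)
    print(f"Wilcoxon rank-sum: statistic = {stat:.2f}, P = {p:.2f}")

    With the study’s actual responses, this comparison yielded P = .16, consistent with the nonsignificant difference reported under Results.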

    RESULTS

    Twenty-seven clinical events were identified: 3 (11%) code blues, 14 (52%) rapid response team activations, and 10 (37%) emergency transfers to the PICU. The median patient age was 8 years (IQR: 2–15). Respiratory failure was the most common cause of the events (44%), followed by cardiac difficulty (33%). The median number of residents responding to an event was 3 (IQR: 2–3).

    A total of 74 surveys were sent, with a 64% response rate (47 surveys were completed [postgraduate year 1: 11; postgraduate year 2: 18; and postgraduate year 3: 18]). Twenty-one (45%) of 47 surveys indicated that residents perceived that they had participated in a previous simulation session that was similar to the critical event.

    Residents self-identified the roles they had filled in the critical events. Fourteen (30%) of 47 responses indicated that the resident served as team leader. Residents often reported talking with patient family members (49%, n = 23), calling consultants (38%, n = 18), and reviewing patient medical records (38%, n = 18). A small number of survey responses indicated that the corresponding resident performed a procedure during the described event: 5 (11%) reported performing chest compressions, 5 (11%) reported performing airway repositioning, and 1 (2%) reported performing tracheal intubation. The majority (74%, n = 35) of surveys indicated that the responding resident had not performed any procedure during the event.

    In terms of preparedness, most residents indicated that they felt well prepared for the critical event (median: 4 [well prepared]; IQR: 3–4) (Fig 1). The residents who perceived that they had previously participated in a similar simulation reported that the scenarios, simulation environment, and mannequins were realistic compared with actual clinical events (Table 1). These residents reported that the earlier simulation event positively affected their behaviors during the clinical event (Table 2). Residents’ confidence levels, however, did not significantly differ between those who reported having had an earlier similar simulation and those who had not (median of 4 [IQR: 3–4] vs median of 3 [IQR: 3–4]; P = .16, Wilcoxon rank-sum test).

    FIGURE 1. Sense of preparedness by residents who had experienced a similar earlier simulation versus those who had not. Forty-seven surveys were completed by 20 residents that described a total of 23 critical events. Twenty-one of the 47 surveys indicated that the participating resident had completed a similar simulation.

    TABLE 1. Responses to the Following Questions Comparing Previous Simulation Experience With the Current Critical Event

    TABLE 2. Self-Evaluation of How Previously Completed Simulation Experiences Affected Performance During the Current Critical Event

    DISCUSSION

    Simulation-based education is becoming popular in postgraduate medical training. With the Accreditation Council for Graduate Medical Education’s transition toward competency-based education, simulation will likely be formalized as a training and evaluation modality. The goal of the present pilot study was to evaluate the perceived impact of earlier similar simulations on learners’ preparedness. Capturing residents’ perceptions of simulation directly after a critical event advances on previous studies, which evaluated resident perceptions of simulation in isolation from clinical encounters. Live critical events seem to be opportune moments to identify perceived gaps in previous educational experiences.

    Participating residents reported that preceding similar simulations positively affected their performances during clinical events, specifically including knowing when to call for help, understanding medical management, and feeling confident enough to contribute knowledge (Table 2). The majority of residents reported confidence in the management of critical events regardless of whether they reported having experienced an earlier similar simulation. There are several possible explanations for this finding. First, the nonsignificant difference in confidence levels between groups may be due to the small sample size inherent to this pilot study. Second, residents likely would have participated in ≥1 simulation even if they did not perceive that any were similar to the critical event, and during any simulation they would have been taught key behavioral skills (including teamwork, role clarity, and communication skills) that could improve their preparedness. Finally, we acknowledge that self-confidence does not imply competence.13 Novice learners often overestimate their skill levels, and residents who had participated in a similar simulation may have been more critical of their own preparedness.

    From the perspective of pediatric residency training programs, the evaluation of simulation-based education should move from measuring trainee perceptions toward measuring behaviors in clinical encounters, as described in the Kirkpatrick evaluation model.14 From the perspectives of hospitals and patients, evaluating simulation-based education should move from measuring behavioral improvements in laboratory/simulation settings to measuring improved patient care practices and patient outcomes.15 In the present study, we explored the possibility of evaluating the perceived impact of simulation-based education on pediatric emergencies by auditing pediatric residents immediately after involvement in actual critical events. Although this level of analysis ranks low in the Kirkpatrick evaluation model (level 1), our assessment was paired with discrete clinical events, an approach that, to our knowledge, has not previously been taken. In the future, direct observation and video recording could be used in simulations and critical events to measure the impact of simulation education more objectively. Direct observation and video recording are becoming common in trauma and neonatal resuscitation,16–20 although to our knowledge no one has paired observations of residents in simulations with those of residents in live events. Such a comparison may link specific simulation components with the improved patient outcomes9,21 that have previously been correlated with initiation of simulation programs.

    Finally, it is noteworthy that residents performed relatively few procedures during critical events. Although it was outside the scope of this project to assess details of procedural experience, this finding supports a growing concern regarding the emergency procedural competency of graduating pediatric residents.22

    The present study has several limitations inherent in a survey design. This research was a pilot study with a limited time frame, and any statistical inference is therefore likely underpowered. It is also important to emphasize that the measured effectiveness of simulation training relied on self-perception, which falls at the lowest level of the Kirkpatrick evaluation model.14 Our assessment of simulation training effectiveness was performed after clinical events, which could have introduced recall bias and might have positively or negatively affected the perceived effectiveness of simulation training. We also had a lower-than-ideal response rate to our questionnaire, which might have positively skewed the results (ie, more engaged resident trainees might have responded selectively). Finally, as already discussed, it is a well-documented phenomenon that trainees report a false sense of confidence when skill levels are low (ie, the Dunning-Kruger effect).23

    CONCLUSIONS

    The novel technique of auditing residents after live critical events regarding perceptions of previous simulations was successfully piloted. Residents reported that their experiences in preceding similar simulations positively affected their performances during actual clinical events.

    Acknowledgments

    The authors wish to thank Marcy Singleton, APRN, Derek Callaway, PA, Heather Robertson, MD, and Ashley Sens, MD, for their leadership in simulation training sessions.

    Footnotes

    • FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.

    • FUNDING: This study was funded by Helen’s Fund at the Geisel School of Medicine. Corinne Happel was at Dartmouth-Hitchcock Medical Center during the time of this study.

    • POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

    Abbreviations
    CHaD: Children’s Hospital at Dartmouth
    IQR: interquartile range

    REFERENCES

    1. Hunt EA, Patel S, Vera K, Shaffner DH, Pronovost PJ. Survey of pediatric resident experiences with resuscitation training and attendance at actual cardiopulmonary arrests. Pediatr Crit Care Med. 2009;10(1):96–105.
    2. Kory PD, Eisen LA, Adachi M, Ribaudo VA, Rosenthal ME, Mayo PH. Initial airway management skills of senior residents: simulation training compared with traditional training. Chest. 2007;132(6):1927–1931.
    3. Sam J, Pierse M, Al-Qahtani A, Cheng A. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes. Paediatr Child Health (Oxford). 2012;17(2):e16–e20.
    4. Donoghue AJ, Durbin DR, Nadel FM, Stryjewski GR, Kost SI, Nadkarni VM. Effect of high-fidelity simulation on Pediatric Advanced Life Support training in pediatric house staff: a randomized trial. Pediatr Emerg Care. 2009;25(3):139–144.
    5. Nadel FM, Lavelle JM, Fein JA, Giardino AP, Decker JM, Durbin DR. Assessing pediatric senior residents’ training in resuscitation: fund of knowledge, technical skills, and perception of confidence. Pediatr Emerg Care. 2000;16(2):73–76.
    6. Adler MD, Vozenilek JA, Trainor JL, et al. Development and evaluation of a simulation-based pediatric emergency medicine curriculum. Acad Med. 2009;84(7):935–941.
    7. Sudikoff SN, Overly FL, Shapiro MJ. High-fidelity medical simulation as a technique to improve pediatric residents’ emergency airway management and teamwork: a pilot study. Pediatr Emerg Care. 2009;25(10):651–656.
    8. Mayo PH, Hackney JE, Mueck JT, Ribaudo V, Schneider RF. Achieving house staff competence in emergency airway management: results of a teaching program using a computerized patient simulator. Crit Care Med. 2004;32(12):2422–2427.
    9. Andreatta P, Saxton E, Thompson M, Annich G. Simulation-based mock codes significantly correlate with improved pediatric patient cardiopulmonary arrest survival rates. Pediatr Crit Care Med. 2011;12(1):33–38.
    10. Smith CC, Huang GC, Newman LR, et al. Simulation training and its effect on long-term resident performance in central venous catheterization. Simul Healthc. 2010;5(3):146–151.
    11. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin. 2007;25(2):361–376.
    12. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ. 2014;48(7):657–666.
    13. Barnsley L, Lyon PM, Ralston SJ, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–367.
    14. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–988.
    15. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(suppl):S42–S47.
    16. Steinemann S, Berg B, Skinner A, et al. In situ, multidisciplinary, simulation-based teamwork training improves early trauma care. J Surg Educ. 2011;68(6):472–477.
    17. Berkenstadt H, Ben-Menachem E, Simon D, Ziv A. Training in trauma management: the role of simulation-based medical education. Anesthesiol Clin. 2013;31(1):167–177.
    18. Katheria A, Rich W, Finer N. Development of a strategic process using checklists to facilitate team preparation and improve communication during neonatal resuscitation. Resuscitation. 2013;84(11):1552–1557.
    19. Carbine DN, Finer NN, Knodel E, Rich W. Video recording as a means of evaluating neonatal resuscitation performance. Pediatrics. 2000;106(4):654–658.
    20. Finer NN, Rich W. Neonatal resuscitation: toward improved performance. Resuscitation. 2002;53(1):47–51.
    21. Theilen U, Leonard P, Jones P, et al. Regular in situ simulation training of paediatric medical emergency team improves hospital response to deteriorating patients. Resuscitation. 2013;84(2):218–222.
    22. Gaies MG, Landrigan CP, Hafler JP, Sandora TJ. Assessing procedural skills training in pediatric residency programs. Pediatrics. 2007;120(4):715–722.
    23. Simons DJ. Unskilled and optimistic: overconfident predictions despite calibrated knowledge of relative skill. Psychon Bull Rev. 2013;20(3):601–607.
    Copyright © 2015 by the American Academy of Pediatrics