An Observed Structured Teaching Evaluation Demonstrates the Impact of a Resident-as-Teacher Curriculum on Teaching Competency
Background and Objective: Residents play a critical role in the education of peers and medical students, yet attainment of teaching skills is not routinely assessed. The primary aim of this study was to develop a novel, skill-based Observed Structured Teaching Evaluation (OSTE) and self-assessment survey to measure the impact of a resident-as-teacher curriculum on teaching competency. The secondary aim was to determine interrater reliability of the OSTE.
Methods: A prospective study quantitatively assessed intern teaching competency via videotaped teaching encounters (videos) before and after a month-long hospital medicine (HM) rotation and via self-assessment surveys over a 5-month period. The intervention group received the resident-as-teacher curriculum. Videos were evaluated by 2 blinded faculty using an OSTE covering 9 skills within 3 core components: preparation, teaching, and reflection. Pre- to postrotation differences were evaluated within and between groups using the Wilcoxon signed rank test and the Wilcoxon rank-sum test, respectively.
Results: Twenty-two of 25 (88%) control and 27 of 28 (96%) intervention interns participated; 100% of participants completed the study. The intervention group's total OSTE score and average self-assessed competence improved significantly from pre- to postmonth; however, no significant difference in overall score was seen between groups. The pre-post difference in preparation scores was significant for the intervention group compared with the control group. The OSTE demonstrated good interrater reliability, with weighted kappas of 0.86 for preparation, 0.71 for teaching, and 0.93 for reflection.
Conclusions: An objective, skill-based OSTE detected observable changes in interns' teaching competency after implementation of a brief resident-as-teacher curriculum. The OSTE's good interrater reliability may allow standardized assessment of skill attainment over time.
Residents play a critical role in both patient care and education of peers and medical students,1–4 yet objective assessment of their teaching skills is not routinely measured. The 2011 duty-hour regulations limited the length of first-year resident shifts to 16 hours, necessitating an increase in shift-based schedules and expansion of the role of residents as teachers, especially at times of limited attending presence. Despite these educational expectations, residents face several barriers to becoming competent teachers, including lack of knowledge, time constraints of clinical responsibilities, and limited training in adult learning theory.5
A variety of resident-as-teacher curricula have been developed to address residents’ capacity to teach,6–12 yet standardized methods for evaluating residents’ teaching abilities remain limited. Recently, the Pediatric Milestone Project reframed resident educational assessment by mapping the competencies to directly observable milestones to facilitate objective tracking of learner development.13,14 Although high-quality assessment tools have been developed for the evaluation of clinical, communication, and professional skills,6–11 the majority of assessment tools related to teaching focus on satisfaction or self-assessment, without objectively measuring attainment of competence.15
Similar concerns regarding valid assessment of medical student clinical competencies were addressed through the development of the Objective Structured Clinical Examination (OSCE).16,17 This objective, validated tool has been widely implemented to assess medical student and resident clinical performance,18–20 with some evidence indicating direct impact on clinical outcomes.21 The ability of an analogous Observed Structured Teaching Evaluation (OSTE)22 to objectively assess teaching skills has not been evaluated. The primary aim of this study was to develop a novel, skill-based OSTE and self-assessment survey to measure the impact of a targeted resident-as-teacher curriculum, delivered within a month-long hospital medicine (HM) rotation, on teaching competency. We hypothesized that the resident-as-teacher curriculum would improve intern teaching abilities as measured by both the OSTE and the self-assessment survey. The secondary aim was to determine interrater reliability of the OSTE.
Methods

A controlled, prospective, pre-post educational study was performed at Cincinnati Children’s Hospital Medical Center (CCHMC) from May to October 2013. This study was approved by the CCHMC Institutional Review Board.
Study Setting and Participants
CCHMC is a large, academic, pediatric medical center with a residency program consisting of >180 trainees. Interns complete at least two 1-month rotations on the HM service. Third-year medical students from the University of Cincinnati College of Medicine spend 1 month on HM during their pediatric clerkship. The HM rotation was selected because of the abundance of teaching opportunities.
Each of the 4 HM teams is composed of 2 senior residents and 3 interns. Each intern works with a third-year medical student for the entire month. Two teams are paired to allow nightly cross-cover. Each pair of teams, together comprising 6 interns and 6 third-year medical students, was assigned to either the intervention or the control group each month of the study by the research coordinator. Two interns were scheduled for 2 HM months during the study period; they were enrolled as controls during their first HM month and excluded from the study during their second.
This study was divided into pre- and postmonth phases. At the beginning of the HM rotation, each intern completed their premonth self-assessment survey and a videotaped teaching encounter. At the conclusion of the HM month, each intern completed the postmonth self-assessment survey and recorded a second videotaped teaching encounter.
For each videotaped encounter, interns were asked to give a 10-minute didactic to a simulated third-year medical student. The topics, pediatric complaints commonly encountered on the HM service, were chosen in advance. Interns in both the control and intervention groups were told the topic just before the video encounter. Although the topic changed each month, it remained the same for the pre- and postmonth sessions and was identical for the control and intervention groups, excluding differences in subject matter as a potential confounder and allowing more reliable assessment of interval change in teaching competency. All videos were filmed by a research assistant, paired, deidentified, and stored in a secure database for later review by the OSTE evaluators. Surveys were collected, anonymized, and entered into a REDCap database.23
All intervention group interns completed the premonth video and self-assessment survey before the study’s 1-hour educational intervention. The intervention curriculum consisted of (1) core clinical didactics: a pocket reference providing interns with information on key elements of history, physical examination, pathophysiology, epidemiology, differential diagnoses, and basic management for 10 common pediatric complaints; (2) facilitated discussion of positive and negative teaching experiences; (3) an internally developed video simulation demonstrating an interactive teaching session using principles from adult education theory; and (4) educational tips for teaching adult learners.24 The control group did not receive any training on teaching nor any of the materials distributed to the intervention group.
All pre- and postmonth videos were evaluated at the end of each month using the OSTE. Two evaluators, blinded to participant group assignment and timing of the video, scored the OSTE independently with subsequent comparison to create a consensus score. Although all faculty evaluators were members of the Division of Hospital Medicine, they were not the supervising attending physicians for the participating interns. Written feedback, including positive teaching behaviors and areas for improvement, was e-mailed to each intern after completion of the study. Interns were given an opportunity to meet with faculty investigators to discuss strategies for improvement and review their videos at the end of the study period.
Survey and OSTE Development
The pre- and postmonth surveys asked interns demographic questions and asked them to self-assess their general knowledge of and experience with teaching key pediatric topics, as well as perceived barriers to teaching. In the postmonth survey, interns also self-assessed their perceived change in teaching competency. Both surveys used a 5-point anchored response scale for self-assessment questions.
The OSTE was developed to objectively assess teaching competency along a continuum from novice to expert, allowing demonstration of progression over time. Three generally accepted subcompetencies, preparation, teaching, and reflection, provided the global structure. Each subcompetency contained 3 skills, each divided into 3 groupings of discretely observable teaching behaviors, allowing a score of 1 to 3 per skill (Fig 1). Interns received an overall score (range 9–27), as well as subscores (range 3–9) for each subcompetency. Although the global framework and specific skills were based on previously published work,25 the observable teaching behaviors were developed by the authors to characterize competency based on adult learning theory.24
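The scoring arithmetic described above can be sketched as a small data structure. The skill names below are invented placeholders (the actual skills appear in Fig 1); this is only a minimal illustration of how subscores and the total relate.

```python
# Illustrative sketch of the OSTE scoring scheme; skill names are hypothetical
# placeholders, not the instrument's actual wording (see Fig 1).
OSTE_SUBCOMPETENCIES = {
    "preparation": ["prep_skill_1", "prep_skill_2", "prep_skill_3"],
    "teaching":    ["teach_skill_1", "teach_skill_2", "teach_skill_3"],
    "reflection":  ["reflect_skill_1", "reflect_skill_2", "reflect_skill_3"],
}

def score_oste(ratings):
    """Each of the 9 skills is rated 1-3, so each subscore spans 3-9
    and the total spans 9-27."""
    subscores = {
        sub: sum(ratings[skill] for skill in skills)
        for sub, skills in OSTE_SUBCOMPETENCIES.items()
    }
    return subscores, sum(subscores.values())

# An intern rated 2 on every skill scores 6 per subcompetency and 18 overall.
ratings = {skill: 2
           for skills in OSTE_SUBCOMPETENCIES.values()
           for skill in skills}
subscores, total = score_oste(ratings)
```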
Both the surveys and the OSTE were reviewed by educational experts in CCHMC’s Divisions of Hospital Medicine, General Pediatrics, and Education and Learning for content validity. The surveys were piloted on 6 senior residents not included in the study and were then amended accordingly. The OSTE underwent several iterative reviews and was amended for clarity and ease of use based on educational and clinical faculty input. Before study initiation, all 3 faculty evaluators reviewed multiple sample videos to discuss OSTE scoring consistency, resolve discrepancies, and modify the tool for clarification.
Statistical Analysis

Demographic and baseline characteristics were compared between the control and intervention groups using the 2-sample t test for age and Fisher’s exact test for categorical variables. Descriptive statistics were used to evaluate distributional properties of the pre- and postmonth survey and OSTE scores, total and subscales. Pre- to postmonth differences were evaluated within and between groups using the Wilcoxon signed rank test and the Wilcoxon rank-sum test, respectively. Because of the small sample size, the analysis did not incorporate the potential clustering within medical teams or potential differences between the groups. Interrater agreement for the OSTE scores was assessed using weighted kappas for each subscore. Intraclass correlation was computed for the overall OSTE scores.
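A minimal sketch of this analysis plan is shown below, using invented scores (not study data) and standard SciPy/scikit-learn routines; it is an illustration under those assumptions, not the authors' actual analysis code.

```python
# Hypothetical illustration of the analysis plan; all scores are invented.
# Requires numpy, scipy, and scikit-learn.
import numpy as np
from scipy.stats import wilcoxon, ranksums
from sklearn.metrics import cohen_kappa_score

# Invented pre/post total OSTE scores (possible range 9-27) for two groups
pre_int  = np.array([14, 15, 13, 16, 12, 15])
post_int = np.array([17, 19, 18, 22, 19, 23])
pre_ctl  = np.array([14, 16, 13, 15, 14, 15])
post_ctl = np.array([15, 16, 14, 15, 15, 16])

# Within-group pre- to postmonth change: Wilcoxon signed rank test (paired)
p_within = wilcoxon(pre_int, post_int).pvalue

# Between-group comparison of pre-post differences: Wilcoxon rank-sum test
p_between = ranksums(post_int - pre_int, post_ctl - pre_ctl).pvalue

# Interrater agreement on one subscore (range 3-9): weighted kappa.
# (The intraclass correlation for total scores needs another package,
# e.g. pingouin, and is omitted here.)
rater1 = [3, 4, 5, 6, 7, 8, 9, 5, 6]
rater2 = [3, 4, 5, 6, 8, 8, 9, 5, 6]
kappa = cohen_kappa_score(rater1, rater2, weights="linear")
```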
Results

Twenty-two control and 27 intervention interns participated over the 5-month period; 4 eligible interns (3 control and 1 intervention) opted not to participate. There were no significant differences between the control and intervention groups in age, race, degrees received, or previous education training at baseline. A higher proportion of the control group than the intervention group agreed/strongly agreed with the statement regarding incorporation of education into future career plans (Table 1).
Teaching Competency Self-Assessment Survey
The pre- and postmonth surveys were completed by 100% of study participants. There were no significant differences between control and intervention interns’ self-assessed knowledge, experience, or teaching competency on the premonth survey. Self-assessed teaching competency improved significantly from pre- to postmonth for the intervention group (P < .01) but not the control group (Table 2). There were no significant differences between control and intervention interns in self-assessed improvement in teaching competency on the postmonth survey.
Pre- and postmonth videos were completed by 100% of study participants, with 1 premonth video terminating early because of a camera malfunction. There was a significant pre-post increase in overall OSTE score for the intervention group (P < .01) but not the control group (Table 2). Assessing the subcompetencies independently, the postmonth preparation subscore was significantly higher in the intervention group than in the control group (P < .05). However, there was no significant difference between the control and intervention groups in overall postmonth OSTE score (Table 2).
The interrater reliability for the OSTE demonstrated good agreement. For each of the 9 skills, the percentage perfect agreement ranged from 85% to 99%. Weighted κ values were 0.86, 0.71, and 0.93 for preparation, teaching, and reflection subscores, respectively. For the total score, the intraclass correlation was 0.93 (95% confidence interval: 0.90–0.95).
Discussion

Objective assessment of residents’ attainment of competencies is critical for the safe and effective practice of medicine; educating patients and other medical professionals is an important skill set for practicing pediatricians. Our OSTE, an objective, skill-based assessment tool, detected significant interval improvement in teaching competency after implementation of a targeted resident-as-teacher curriculum. Additionally, the OSTE demonstrated good interrater reliability.
Although residents are frequently rated by medical students as influential teachers, standardized evaluation of the quality of their teaching has been limited.12 Numerous resident-as-teacher programs have been developed to address resident-delivered feedback, formal lectures, and presentations.6–12 Our curriculum focused on developing effective microburst teaching skills, a common educational interaction between interns and medical students during inpatient rotations. Specifically, our intervention was designed to address common barriers to teaching by providing clinical content that trainees can reference and highlighting core topics of adult learning theory. We intentionally limited the curriculum’s scope to minimize the burden on participants. The study’s high completion rate demonstrates the ability to integrate this curriculum and assessment tool into an already-busy intern schedule.
The OSTE allowed objective assessment of intern teaching similar to the OSCE’s objective assessment of medical student clinical performance.18–20 The structure of the OSTE permitted directed feedback and reinforcement using a standardized set of expectations. The skill-based observable behaviors provide interns with concrete examples to differentiate levels of teaching competence, allowing a reference for self-directed improvement that can be tracked over time. Indeed, much as OSCEs track progression of clinical skills, the repeated use of an OSTE could demonstrate progression of educational skills. Whereas previous studies have used self-perceived improvement as an outcome to evaluate reliability, validity, or impact,26 this study is novel in its use of an OSTE to objectively measure interval improvement in teaching competency in response to a targeted intervention. This directly addresses the stated goals of the Pediatric Milestones Project, facilitating assessment of resident progress toward proficiency in the skill sets and characteristics of a practicing pediatrician.13,14
This study has several limitations. The study was limited to 5 months to minimize cross-contamination that can occur given the nature of intern rotations. As a result, our ability to account for the nested design and potential differences between the 2 groups was limited by our small sample size. Intervention/control assignments were done at the team level; individual interns were not randomly assigned. Interestingly, the control group had a significantly greater percentage of participants showing interest in teaching in their future career. This potentially speaks to higher motivation for learning to teach and may have contributed to the lack of difference in most postintervention comparisons. Our intervention was a single educational session designed to require minimal time; however, additional sessions may have increased retention and application of educational skills. Lastly, this was a single-center study and may not be generalizable to other training programs.
Conclusions

Use of a skill-based OSTE detected changes in residents’ teaching competency after implementation of a resident-as-teacher curriculum as part of an HM rotation. The tool’s good interrater reliability may allow training programs to provide standardized assessment of teaching skills. An objective, measurable, and reproducible skill-based OSTE may give training programs the ability to track residents’ attainment of teaching skills, which are essential for a practicing pediatrician.
IN THE AUTHOR’S OWN WORDS
Effective teaching is integral to the care of hospitalized children. HM rotations offer an abundance of teaching opportunities. This study demonstrates a novel tool that detects changes in residents’ teaching competency with good reliability. Use of the tool may allow training programs to track attainment of teaching competencies over time.
The authors thank Kaitlyn Bode, research assistant, and the interns and medical student actors who participated in this study. Additionally, Drs Jennifer O’Toole, Angela Statile, and Andrew Beck were crucial to the development of the OSTE and review of this publication. Dr Zackoff received funding for this project through the APA Resident Investigator Award.
FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.
FUNDING: Dr Zackoff received an Academic Pediatric Association Resident Investigator Award to support the completion of this study.
POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
Keywords: Cincinnati Children’s Hospital Medical Center; hospital medicine; Observed Structured Teaching Evaluation
- Copyright © 2015 by the American Academy of Pediatrics