Development and evaluation of self-, peer-, and expert-assessments in video-recorded simulated cataract surgery by ophthalmology residents for competency by design (CBD) training

Theme:
Hot Topic
What:
Paper Presentation
When:
4:32 PM, Saturday 2 Jun 2018 (3 minutes)
Authors: Stephanie Cheon, Cornelis de Jager, Rylan Egan, Mark Bona, Christine Law
Author Disclosure Block: All authors (S. Cheon, C. de Jager, R. Egan, M. Bona, C. Law): Grant/research support; Maudsley Scholarship and Research Fund.

Abstract Body:

Purpose: The shift of medical education in Canada to CBD has created a need for more structured simulation-based surgical training curricula and evaluation tools for ophthalmology trainees. The currently available validated ophthalmology surgical assessment tools are designed predominantly for intraoperative cases, and they neither include competency targets nor utilize self- or peer-assessment. In this pilot study, we sought to develop a valid evaluation tool and to identify competency targets for cataract surgery performed by trainees in a simulated environment.

Study Design: Creation of an assessment tool by conventional Delphi method, followed by a prospective validation study of the tool using self-, peer-, and expert-assessments.

Methods: The study was carried out in two phases. Phase 1: An assessment tool was established by conventional Delphi method, comprising a) four procedural items scored from 0 (not performed) to 7 (competent), and b) a global rating scale (GRS) requiring yes/no answers to four performance-related questions. The procedural items were incision and paracentesis (formation and technique), viscoelastic (appropriate use and safe insertion), capsulorrhexis (commencement of flap and follow-through), and capsulorrhexis (formation and circular completion). Nine cataract surgery experts provided feedback and modifications over two rounds. Phase 2: Eight ophthalmology residents who were novices in cataract surgery each completed ten simulated surgeries. Two staff ophthalmologists graded the masked videos using the assessment tool. Each resident graded their own ten videos and ten of their peers' videos in sequential order.

Results: Phase 1: The first round of the Delphi method involved yes/no answers to questions about the tool; agreement ranged from 55.56% to 100%. The second round used a Likert-type scale to assess conformity of responses. Agreement was 80% or greater (strongly agree or agree) for all items and improved from the first round. Phase 2: The mean expert score ranged from 3.17±0.86 to 4.67±1.23 across all procedural items and attempts, and improved between the first and tenth attempt for each of the four items. There was a general trend toward improved GRS competency and faster surgery completion times by the tenth attempt, but skill proficiency was not necessarily attained. The difference between mean expert and self scores, and between mean expert and peer scores, for all procedural items across attempts ranged from -0.48±0.78 to 0.56±0.23 and from -0.30±0.17 to 1.11±1.26, respectively. Overall expert scores were higher than overall self and peer scores for three of the four items. Overall incision and paracentesis scores and viscoelastic scores were consistently higher than overall capsulorrhexis (commencement of flap and follow-through) and capsulorrhexis (formation and circular completion) scores across expert-, self-, and peer-assessments.

Conclusions: Our study helps define expectations for the pre-surgical learner and guides objective competency-based assessment. The development of a refined assessment tool, and the potential for residents to self-assess and provide peer feedback as an adjunct to expert feedback, should be further investigated as a method to improve not only trainee performance but also lifelong learning skills.