Objective Structured Clinical Examinations (OSCEs) have been adopted as a means of assessing midwifery students’ clinical skills. The purpose of the OSCE is to provide a standardised method for evaluating clinical skill performance in a simulated environment. This paper describes how a quality improvement initiative using both internal and external expert review was used to improve OSCE assessment marking criteria. The purpose of the quality initiative was to review the content and face validity of the marking criteria used to assess performance. The design and choice of tools used to score students’ performance are central to reliability and validity. Twenty videos of students from year one of a midwifery preregistration programme undertaking an OSCE assessment of abdominal examination, and 18 videos of students’ responses to obstetric emergencies, e.g. postpartum haemorrhage (PPH) and shoulder dystocia, were available for review. The quality initiative aimed to strengthen the reliability and validity of the OSCE in assessing student performance. Conclusion: the use of global rating scales allows the capture of elements of professional competency that do not appear on specific criteria in skills performance checklists.
History
Publication
Elsevier; In Press
Publisher
Elsevier
Note
peer-reviewed
Rights
This is the author’s version of a work that was accepted for publication in Nurse Education Today. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Nurse Education Today, In Press doi: 10.1016/j.nepr.2012.11.006