Development of an Objective Structured Clinical Examination to Assess Medical Student Competence in the Ocular Examination
23 December 2014 (online)
Purpose Competently performing the ocular examination is an essential skill for every graduating medical student, and assessing competence in this skill is equally essential. Unfortunately, few tools exist for assessing clinical competence in ophthalmology. The Objective Structured Clinical Examination (OSCE) format has been shown to be a useful tool for assessing clinical competence in other academic settings. It is hypothesized that the OSCE format can also be used to assess clinical competence in performing the ocular examination. This study presents a novel use of the OSCE format to assess competence in obtaining a history of headache and performing the ocular examination among third- and fourth-year medical students.
Methods An observational design was used to assess the competence of third- and fourth-year medical students in taking a brief history for the chief complaint of headache and performing the ocular examination. The ophthalmology OSCE was administered after a one-week ophthalmology clerkship at the University of Wisconsin-Madison School of Medicine and Public Health. Standardized patients were trained by ophthalmology staff prior to the OSCE; their pupils were not dilated for the direct ophthalmoscopy portion of the examination.
Results Students were graded on their performance in obtaining a history of headache and performing the ocular examination. The ocular examination included assessment of ocular motility and pupils, confrontation testing to identify a visual field defect, and use of the direct ophthalmoscope to match a standardized patient's optic nerve to one of four photographs in the room. A checkbox system was used to assess each student's competence, reported as pass, marginal, or fail. Failing students were remediated by an ophthalmologist. A total of 384 students took the examination from 2008 to 2012. Overall, 84% received a passing score, 11% a marginal score, and 5% a failing score.
Conclusion Establishing and assessing competence continue to be major focuses of medical student education. The OSCE format provides a standardized testing platform for assessing competence in performing the ocular examination, and the use of standardized patients provides a more natural format than paper-based or other methods of assessment.