The Association of University Professors of Ophthalmology (AUPO) and the International
Council of Ophthalmology (ICO) Taskforce have established standards for the ophthalmic
proficiency expected of all graduating medical students. These standards highlight several
fundamental components of the eye examination, including proficiency in measuring visual
acuity; detecting abnormal eye movements, pupillary responses, and ocular surface abnormalities;
and performing direct ophthalmoscopy to evaluate the optic nerve head and fundus.[1] [2]
Despite these recommendations, both medical students and practicing nonophthalmologists
infrequently perform a basic eye examination or do so poorly.[3] [4]
While several barriers may contribute to this problem, one of the most apparent obstacles
is the difficulty of teaching procedural skills within a medical student curriculum. Especially
in a technically demanding field like ophthalmology, proper training with unique instruments
is needed to evaluate symptoms, dictate initial management, or even refer patients with
some of the most basic ophthalmic problems.[5] [6] [7] Considerable investments in equipment,
instructors, and time are required to train medical students adequately. Nonetheless, it
is critical for medical school graduates to learn to perform an accurate and thorough eye
examination, as several sight- and life-threatening conditions can be discerned by the proper
implementation of these skills.[3] [8]
In response to this need, we designed and implemented an ophthalmology clinical skills
training session that was integrated with a preexisting, 1-week-long preclinical ophthalmology
course. In this study, we evaluate the effectiveness and utility of this one-time skills
workshop, which may serve as a practical teaching model. We assess students' reported
self-confidence with eye examination skills, gauge their preferences between the traditional
direct and PanOptic ophthalmoscopes, and explore long-term retention of fundoscopy skills
in an objective assessment on standardized patients. We use this information to suggest
changes to standard fundoscopy practice among nonophthalmologists.
Methods
Study Population
Participants were second-year medical students at the University of Miami Miller School
of Medicine. Inclusion criteria required participation in the preclinical ophthalmology
course, completion of a presession quiz, and attendance at the small-group skills
training session. The University of Miami Institutional Review Board determined that
this study meets the criteria for an exemption as described in Federal Regulation
45 CFR 46.104.
Overview of Protocol
This small-group training session was integrated with the second-year medical students'
1-week, preclinical ophthalmology course that was held in July 2019. The training
was conducted in a 90-minute session midway through the course. Each group was led
primarily by fourth-year medical students entering ophthalmology along with ophthalmology
residents, fellows, and faculty from Bascom Palmer Eye Institute (BPEI). Fourth-year
medical students were selected to be trainers by an attending ophthalmologist prior
to the training session after having demonstrated proficiency in performing and teaching
a comprehensive eye examination. Pre- and postsurveys on students' confidence in performing
each part of the eye examination were administered. A cohort of students in our combined
MD/MPH dual-degree program who underwent end-of-year clinical competency exercises
in March 2020 (8 months after the initial session) was assessed on their direct ophthalmoscopy
skills using both the PanOptic ophthalmoscope and the traditional direct ophthalmoscope.
Presession
Medical students viewed presession instructional videos and completed an accompanying
quiz. Students also completed a survey asking them to report their self-confidence
in each aspect of the eye examination according to a 6-point Likert scale.
Small-Group Training
The second-year medical students were divided into 24 groups of eight to nine students.
Groups were primarily led by fourth-year medical students who were trained and selected
by an ophthalmology faculty member (C. R. A.). BPEI residents, fellows, and attendings
led some groups and were “floating” instructors. The training session lasted 90 minutes.
Each session began with 5 minutes of standardized instruction on how the training
was organized.
The training sequence covered key components of the eye examination ([Fig. 1]) and was split into two portions. In one portion, 40 minutes
were allotted for training in external eye inspection, visual acuity, extraocular
motility assessment, confrontation visual fields assessment, and intraocular pressure
(IOP) measurement. In the other portion, 40 minutes were allotted for training in
pupillary light reflex exam, corneal staining, and fundoscopic examination via direct
ophthalmoscopy. Due to equipment limitations, half of the groups started with one portion
of the training while the other half started with the second. Fluorescein sodium
sterile ophthalmic strips 0.6 mg (Akorn, Lake Forest, IL) were used. Students were
trained with the Tono-Pen tonometer (Reichert Technologies, Depew, NY) for IOP measurement.
Students were trained with traditional direct ophthalmoscopes and PanOptic ophthalmoscopes
with and without an iExaminer adapter (Welch Allyn, Skaneateles Falls, NY).
Fig. 1 Training sequence flowsheet. CVFA, confrontational visual field assessment; EOMA,
extraocular motility assessment; IOP, intraocular pressure; VA, visual acuity.
Students practiced administering these elements of the eye examination on each other
in pairs. Students were taught how to properly calibrate and use the Tono-Pen tonometer
for measuring IOP. For direct ophthalmoscopy training, eye models provided by Welch
Allyn were used when needed. Students were given equal time to practice on each other
with traditional direct ophthalmoscopes and PanOptic ophthalmoscopes.
Postsession
The final 10 minutes of the session were dedicated to feedback. Students who completed
the presession survey and attended the small-group training were asked to complete
a postsession survey that collected information on their confidence with each element
of the eye examination that was reviewed. They were also queried regarding which instrument
they preferred for learning direct ophthalmoscopy. A 6-point Likert scale was used
again.
Long-Term Follow-Up
Optic nerve photographs of two standardized patients were taken at the BPEI Imaging
Center. Separate multiple-choice quizzes were created for each standardized patient.
Quizzes contained five options: four optic nerve photographs and an “unable to observe
the optic nerve” option. Students who had completed the training session 8 months prior
were instructed to visualize one standardized patient's fundus with a traditional
direct ophthalmoscope and another standardized patient's fundus with a PanOptic ophthalmoscope.
Each standardized patient's pupillary diameter, measured in dim light, was approximately
4 mm. Students then completed the quiz for each patient immediately following the fundus
examination. This follow-up was held in March 2020 during their year-end clinical competency
exercises.
Statistics
The paired Student's t-test was used to analyze the difference between pre- and postsession survey responses.
The independent Student's t-test was used to compare confidence ratings between the traditional direct and PanOptic
ophthalmoscopes. All analyses were performed with consultation from the University
of Miami Data Services team.
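As an illustration of this analysis approach, the sketch below runs a paired and an independent Student's t-test in Python with SciPy. It is only a minimal sketch under assumed, made-up Likert ratings; the variable names and sample values are illustrative and do not reflect the study's actual survey data or analysis software.

```python
# Minimal sketch of the analyses described above (paired t-test for pre- vs.
# postsession confidence; independent t-test for traditional direct vs.
# PanOptic confidence). The ratings below are made-up placeholders on the
# study's 6-point Likert scale (0-5), not the actual survey data.
from scipy import stats

# Paired, one-tailed test: did the same students rate themselves higher
# after the session than before? (the 'alternative' keyword needs SciPy >= 1.6)
pre_session = [2, 3, 1, 2, 3, 2, 4, 1]
post_session = [4, 5, 4, 4, 5, 4, 5, 3]
t_paired, p_paired = stats.ttest_rel(pre_session, post_session,
                                     alternative="less")

# Independent, one-tailed test: are PanOptic confidence ratings higher than
# traditional direct ophthalmoscope ratings?
direct_ratings = [3, 4, 4, 3, 4, 3, 5, 4]
panoptic_ratings = [5, 4, 5, 5, 4, 5, 5, 4]
t_ind, p_ind = stats.ttest_ind(direct_ratings, panoptic_ratings,
                               alternative="less")

print(f"Paired test:      t = {t_paired:.2f}, p = {p_paired:.4f}")
print(f"Independent test: t = {t_ind:.2f}, p = {p_ind:.4f}")
```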
Results
Of the 197 second-year medical students who participated in our clinical skills training
session, 172 students (87.3% response rate) completed the presession survey and 108
students (54.8% response rate) completed the postsession survey. Participating students
reported increased confidence in all components of the eye examination that were reviewed
(2.80 vs. 4.52, p < 0.01), reflected by a change from “uncomfortable/neutral” to “extremely comfortable/comfortable.”
They reported the greatest absolute change in mean confidence with fluorescein staining
of the cornea following the training session (0.67 vs. 4.31, p < 0.001). Students also reported increased confidence with observing the optic nerve
with the PanOptic ophthalmoscope (1.21 vs. 4.48, p < 0.001) and measuring IOP (1.62 vs. 4.25, p < 0.001). Students reported the smallest absolute change in mean confidence with
the pupillary exam after the training session (3.77 vs. 4.61, p < 0.001; [Fig. 2]).
Fig. 2 Mean student self-confidence levels in individual components of the eye examination
before and after the training session, compared using a paired one-tailed Student's t-test.
A 6-point Likert scale was used: (0) never tried, (1) very uncomfortable, (2) uncomfortable,
(3) neutral, (4) comfortable, and (5) extremely comfortable. *Denotes significance at
p-value <0.01.
Overall, students were satisfied with the training session and perceived ophthalmology
clinical skills to be important in their future medical careers. Most students were
“extremely satisfied/satisfied” (mean score of 4.71) with the training that they received.
A total of 107 out of 108 (99.1%) students reported that they visualized the optic
nerve with either the traditional direct or PanOptic ophthalmoscope. Students “strongly
agreed/agreed” (mean score of 4.64) that fundamental eye examination skills will be
beneficial for the care of their future patients and that they would “absolutely/occasionally”
(mean score of 4.59) use their improved eye examination skills in their clinical practice
([Table 1]).
Table 1
Student perception of the small-group ophthalmology clinical skills training session
Statement/question | Mean presession response (SD) | Mean postsession response (SD)
(a) A working knowledge of the fundamentals of the eye exam is beneficial for the care of my patients (n = 172) | 4.64 (±0.55) | –
(b) Will skills today help in future practice? (n = 108) | – | 4.59 (±0.58)
(c) Satisfaction with today's session (n = 108) | – | 4.71 (±0.67)

Question | Yes | No
Did you see the optic nerve today? (n = 108) | 107 | 1
Abbreviation: SD, standard deviation.
Students were asked their opinions on the items listed above, scored as follows:
(a) 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.
(b) 1 = not at all, 2 = possibly, 3 = neutral, 4 = occasionally, 5 = absolutely.
(c) 1 = very unsatisfied, 2 = unsatisfied, 3 = neutral, 4 = satisfied, 5 = extremely
satisfied.
Students also reported a clear preference for the PanOptic ophthalmoscope. Before
the session, most students stated that they were more confident using the traditional
direct ophthalmoscope than the PanOptic ophthalmoscope (2.16 vs. 1.21, p < 0.001). However, after training with both instruments, the vast majority of students
expressed greater confidence in visualizing the fundus and optic nerve head with
the PanOptic ophthalmoscope (4.49 vs. 3.93, p < 0.001). A total of 80 out of 85 students (94.1%) preferred the PanOptic ophthalmoscope
over the traditional direct ophthalmoscope ([Table 2]).
Table 2
Traditional direct ophthalmoscope versus PanOptic ophthalmoscope
Measure | Traditional direct | PanOptic | Difference | Fold difference | p-Value
(a) Mean presession confidence level (SD)^a, n = 172 | 2.16 (±1.16) | 1.21 (±1.37) | –0.95 | 0.51 | 7.09E-08
Mean postsession confidence level (SD)^a, n = 108 | 3.93 (±1.02) | 4.49 (±0.64) | 0.55 | 1.14 | 2.91E-06
(b) Which did you find easier to use? (number of students) | 5 | 80 | – | – | –
Abbreviation: SD, standard deviation.
(a) Mean confidence levels for the traditional direct ophthalmoscope and the PanOptic
ophthalmoscope are compared using an independent one-tailed Student's t-test. Students
completed a survey rating their confidence (0 = never tried, 1 = very uncomfortable,
2 = uncomfortable, 3 = neutral, 4 = comfortable, 5 = extremely comfortable). Confidence
levels differed significantly between the traditional direct and PanOptic ophthalmoscopes
in both the pre- and postsession surveys. The Difference column denotes the PanOptic mean
confidence level minus the traditional direct mean confidence level. ^a Denotes p-value <0.001.
(b) Number of students who preferred the traditional direct versus the PanOptic ophthalmoscope.
A subset of the second-year class (MD/MPH dual-degree students) who completed end-of-year
competency exercises in March 2020 was assessed on their direct ophthalmoscopy skills
using both the PanOptic ophthalmoscope and the traditional direct ophthalmoscope. These
students had not had a formal review of this examination technique since the initial
session 8 months prior. Based on the multiple-choice quiz results following
fundus examination on standardized patients, students demonstrated greater accuracy
and ease with the PanOptic ophthalmoscope than with the traditional direct ophthalmoscope.
With the PanOptic ophthalmoscope, 24 (24/42; 57.1%) students identified the correct
fundus image, 17 (17/42; 40.5%) students selected an incorrect image, and 1 (1/42;
2.4%) student could not find the optic nerve. With the traditional direct ophthalmoscope,
4 (4/42; 9.5%) students identified the correct fundus image, 10 (10/42; 23.8%) students
selected the incorrect image, and 28 students (28/42; 66.7%) could not find the optic
nerve ([Table 3]).
Table 3
Long-term retention of ophthalmoscopy skills with the traditional direct ophthalmoscope
and PanOptic ophthalmoscope
n = 42 | Traditional direct | PanOptic
(a) Correct responses (%) | 4 (9.5) | 24 (57.1)
Incorrect responses (%) | 10 (23.8) | 17 (40.5)
Did not visualize optic nerve (%) | 28 (66.7) | 1 (2.4)
(b) Students who visualized optic nerve (%) | 14 (33.3) | 41 (97.6)
(a) Number of students who selected the correct image, selected an incorrect image, or
could not visualize the optic nerve on a multiple-choice quiz following examination of
standardized patients.
(b) Number of students who affirmed visualization of the optic nerve following fundus
assessment of a standardized patient.
Discussion
The Importance of Clinical Ophthalmology Skills Training
Conducting an accurate and thorough eye examination is critical to the diagnosis of
several sight- and life-threatening medical conditions.[3] [8] As gatekeepers to managed
care, primary care physicians must be proficient in basic ophthalmology skills to detect
signs of emergent vision loss, screen for retinal degenerative disease, and evaluate
systemic and microvascular disease (hypertension, diabetes, etc.).[7] [9] Emergency
medicine physicians commonly perform instrument-based components of the
eye examination, such as tonometry, fluorescein staining, and ophthalmoscopy, to assess
for ocular trauma and acute retinopathy.[10] A well-performed eye examination can reveal key diagnostic and prognostic findings
of systemic diseases that are managed among many specialties.
There is a growing consensus that medical school graduates and nonophthalmology providers
should be competent in key elements of the eye examination. The Association of American
Medical Colleges, AUPO, the American Academy of Ophthalmology, and the ICO emphasize
that medical students, at minimum, should be able to perform a basic eye examination
with fundoscopy and describe their observations proficiently.[1] [11] [12] Medical students
should also be able to visualize the red reflex, the retina, and
optic disc; assess the optic disc for cupping, color, contour, margins, vessels, and
edema; and recognize changes associated with glaucoma and macular degeneration.[1]
Given that many unique instruments and imaging modalities are used to validate symptoms
and reach diagnoses, substantial hands-on time is necessary for medical students to
gain adequate knowledge, comfort, and confidence with these technical skills. However,
ophthalmology education has become deprioritized in many medical school curriculums.
Mean dedicated ophthalmology curriculum hours and mandated clinical teaching have steadily
declined over the past several years.[13] As a result, many medical school graduates are inadequately trained and lack confidence
in their ability to perform these important clinical skills upon entering residency
training.[5] [6] [7]
It is reassuring to note that our medical students share this sentiment regarding the
importance of ophthalmology clinical skills in the future care of their patients ([Table 1]). While students understand how significantly these skills can affect patient
care, it is the utmost duty of medical educators to provide the time, resources, and
comprehensive training for their students to obtain the knowledge and skills that
they seek.
Reflections and Evaluation of Our Small-Group Clinical Training Session
Our one-time, small-group training session was effective in teaching our medical students
fundamental components of the eye examination in a concise 90-minute session. In all
components of the eye examination that were reviewed, students' confidence improved
([Fig. 2]). The greatest increase in reported confidence occurred with fluorescein staining
of the cornea. This most likely occurred because students had the least amount of
prior exposure to this skill. Low presession confidence levels directly correlate
with the degree of unfamiliarity with both the instrument and technique, and the presession
confidence was lowest in this component of the eye examination (mean score of 0.67,
“very uncomfortable/never tried”). Other components of the eye examination with low
presession confidence levels (mean score < 2.00) and hence lower degrees of familiarity,
include observing the optic nerve head with the PanOptic ophthalmoscope and measuring
IOP (mean score of 1.21 and 1.62, respectively).
Certain presession confidence levels are higher than expected because some medical
students are exposed to ophthalmology clinical skills through our Department of Outreach
and Community Services Health Fairs. These community health fairs include an ophthalmology
station, where students conduct visual acuity screening, frequency doubling technology
visual field testing, tonometry, and corneal pachymetry. Medical students attend a
mandatory 30-minute skills session before each health fair to review these eye examination
skills. They do not receive training in direct ophthalmoscopy as only physicians perform
the fundus examination at the health fairs. Higher-than-expected presession confidence
levels may also be explained by an incomplete understanding of what an examination component
entails. For example, a proper external inspection of the eyes and eyelids includes several
components, such as noting eyelid lesions, proptosis, and eyelid malpositions, to name a few.
Students may have felt confident in what they perceived to be a thorough external eye
examination without realizing that they had omitted other important components; hence,
they may have reported a higher level of confidence than was warranted.
While the educational benefits of our ophthalmology clinical training session are
reflected by the responses of our second-year medical students, upperclass medical
students (primarily fourth-year students) who served as small-group trainers also reaped
the benefits of solidifying their own skills and further developing their capacity
as medical educators. The “see one, do one, teach one” teaching methodology has been
commonly used among medical trainees to gain proficiency in procedural skills and
techniques.[14] This adage reflects the traditional method of procedural teaching, where a trainee—after
adequately observing a procedure—is expected to perform this procedure and eventually
teach another trainee how to do the same. This teaching model is reflected in our
clinical skills training design, where our upperclassmen trainers at one point observed,
performed, and now teach these examination skills to lowerclassmen trainees. The capacity
to teach another individual to perform a skill reflects the highest order of proficiency.
By utilizing skilled upperclassmen to train underclassmen in our training session,
we helped the upperclassmen advance their technical skills and develop the next generation
of future trainers as part of a continual learning cycle.
Traditional Direct versus PanOptic Ophthalmoscopy
Of all the eye examination components in our study, students reported the lowest postsession
confidence level (mean score of 3.93, “comfortable/neutral”) in observing the optic
nerve head with the traditional direct ophthalmoscope ([Fig. 2]). We suspect that this is due to the technical difficulties many face when using this
instrument. While there is a consensus that medical students and primary care providers
should be proficient in using an ophthalmoscope, there is debate over how much proficiency
is required. With a viewing field of 5 to 10 degrees and 15× magnification, the
direct ophthalmoscope makes it challenging to obtain a proper visualization of the
optic nerve head, not to mention regions away from the nerve.[15] Evidence suggests that the traditional direct ophthalmoscope, even when used by
ophthalmologists, is inadequate in identifying common retinal abnormalities. In a
study comparing the traditional direct ophthalmoscope with the gold standard (seven-field
fundus photography), direct ophthalmoscope examinations by comprehensive ophthalmologists
and retina specialists agreed with the gold standard in only 52 and 70% of cases,
respectively.[16]
The alternatives, multifield fundus photography or indirect ophthalmoscopy, are either
much costlier or difficult to carry out in a nonophthalmology practice. However, the
PanOptic ophthalmoscope demonstrates promise as an alternative to the traditional
direct ophthalmoscope. The PanOptic ophthalmoscope is a “direct” ophthalmoscope but
has a wider field of view (25 degrees) and is more user-friendly than the traditional
direct ophthalmoscope. The vast majority of our medical students preferred the PanOptic
ophthalmoscope (80/85; 94.1%) compared with the traditional direct ophthalmoscope.
Students provided several comments regarding their strong preference for the PanOptic
ophthalmoscope, including its larger field of view, easier maneuverability, stability,
easier focus, further distance from patient, and clearer visualization of vessels
and optic nerve head.
Students also exhibited superior retention of technical fundoscopy skills with the
PanOptic ophthalmoscope in a long-term objective assessment on standardized patients
([Table 3]). While most students (24/42; 57.1%) identified the correct optic nerve image with
the PanOptic ophthalmoscope, 66.7% (28/42) of the students could not even visualize
the optic nerve with the traditional direct ophthalmoscope. Since a majority of students
do not retain their skills with the traditional direct ophthalmoscope, teaching fundoscopy
with this instrument may be less effective or may require more dedicated training time.
Without retained fundoscopy skills, students will be unable to assess crucial ophthalmic
emergencies or identify essential findings associated with commonly managed diseases in
their future clinical practices.
Since skills retention with the PanOptic ophthalmoscope is better, focus in future
sessions can be devoted to reviewing important clinical evaluation skills and pathology.
Furthermore, given the greater ease the PanOptic ophthalmoscope affords, physicians
who do not regularly perform ophthalmic exams may be inclined to do so more often.
For these reasons, we believe that the PanOptic ophthalmoscope should replace the
traditional direct ophthalmoscope not only in the medical education setting but
also in nonophthalmologists' clinical practices.
Long-Term Retention of Ophthalmoscopy Skills
Students were not given the opportunity to review or practice with either ophthalmoscope
before the follow-up assessment. They were markedly more accurate and comfortable
using the PanOptic ophthalmoscope than the traditional direct ophthalmoscope ([Table 3]). Our results also demonstrate that competency in technical skill does not necessarily
correlate with clinical competency. Apart from the differences in student accuracy
between the two ophthalmoscopes, there was a large discrepancy between subjective
student responses affirming visualization of the optic nerve and the objective assessment
on standardized patients. In the follow-up student cohort, 41 (41/42; 97.6%) students
visualized the optic nerve with the PanOptic ophthalmoscope, demonstrating excellent
retention of technical capability. Yet, only 24 (24/41; 58.5%) of these students identified
the correct optic nerve image.
While the majority of these students were accurate with the PanOptic ophthalmoscope,
the discrepancy in clinical competency is concerning. One obvious explanation for
this discrepancy is the 8-month interval during which most students had no ophthalmology
exposure, leading them to forget key features of a baseline fundus assessment (cup, color,
contour, margins, and vessels). Another
contributing factor may be the differences in magnification between the ophthalmoscope
view and the optic nerve images in the multiple-choice quiz. The ophthalmoscopic view is
considerably more magnified than the quiz photographs, which show the entire nerve in a
single image; students therefore had to maneuver the instrument to survey the whole nerve
and rely on visual memory when comparing it with the answer options, and they might not
have evaluated enough features to select the matching image. There is always the possibility
that students simply guessed without actually visualizing the optic nerve on the standardized
patients, even though the quiz included an option for those who could not visualize it. Despite
these contributing factors, the conclusion remains that clinical assessment skills
must be better integrated and reviewed with technical instruction.
Limitations and Future Directions
This study objectively evaluated clinical competency in only a subset of medical students;
for the majority, it relied on self-reported confidence levels in performing each component
of the eye examination. While perceived skill and self-confidence
do not necessarily reflect actual skill proficiency, we believe that these factors
are essential in mediating improved skills in future clinical training. Incorporating
a comprehensive objective evaluation by a trained evaluator, covering all aspects of the
eye examination both immediately after and several months after the session, would provide
a more accurate assessment of training success and skill retention.
Our study also did not evaluate students' ability to discern ocular pathology. The
utility of learning examination skills is marginal without the ability to translate
these skills to identifying pathology. Incorporating clinical scenarios through simulation
or standardized patients may help reinforce the pathology that students learn in their
concurrent ophthalmology preclinical course.
Other limitations of our study are the lower response rate of the postsession survey
and the low student participation in the long-term follow-up exercise on standardized
patients. These limitations may have influenced the interpretation of the postsession results.
The response rate was lower because the postsession survey was optional and did not
accompany a mandatory quiz, unlike the presession survey. While MD/MPH dual-degree
students were assessed during their end-of-year clinical competency exercises, extenuating
circumstances forced our medical school to suspend all clinical and standardized patient
activities for the remainder of the academic year. As a result, the end-of-year clinical
competency exercises scheduled for MD-track students did not take place. However,
this limited dataset is still valuable as it includes students who did not have a
formal review session of direct ophthalmoscopy and who have a variety of clinical interests.
Our study shows that students who participated in our course had an overall improvement
in their comfort and confidence with the eye examination. Our course structure is
succinct and sustainable given the small-group, peer-led format. This session can
be incorporated at other medical institutions without major disruption to their curriculums
and may better prepare their medical students to identify basic ophthalmologic diseases
and emergencies.
Our study also reveals students' overwhelming preference for the PanOptic ophthalmoscope.
We believe that the PanOptic ophthalmoscope should be further integrated into medical
education and clinical practice. Given the dwindling hours dedicated to ophthalmology
education, it would be more effective to train students with a tool that is easier
to use and enables improved skills retention. However, PanOptic ophthalmoscopes are not
widely available in medical schools, clinical practices, or hospitals, and acquiring them
may be economically burdensome. Another major hurdle is challenging the norm of using
the traditional direct ophthalmoscope. We believe that a shift in the standard of
direct funduscopic visualization in the primary care setting is warranted and should
be proposed for discussion. A study that collects data on the frequency with which
primary care providers use the traditional direct ophthalmoscope in their clinical
practice may shed light on the utility of this instrument and provide further impetus
for change to other instruments like the PanOptic ophthalmoscope.