
DOI: 10.1055/s-0045-1812486
Performance Analysis of the Chilean National Medical Examination in Orthopedics and Traumatology
Article in several languages: español | English
Abstract
Introduction
The Chilean National Medical Examination in Orthopedics and Traumatology (EMNOT) is a formative tool designed to assess the theoretical performance of graduates from national orthopedic residency programs. Although it has been administered since 2009, no prior studies have examined examinee performance by question category and subcategory.
Methods
This retrospective observational study included all questions administered in the EMNOT between 2009 and 2023. A total of 1,742 questions were classified into general categories (adult, pediatric, and basic sciences) and specific subcategories (musculoskeletal trauma, spine, shoulder and elbow, hand, and knee, among others). The percentage of correct responses (PCR) per question was analyzed using fractional regression to assess differences by content area.
Results
The median PCR was 68% (IQR 49–83%), with no significant variation over time (p = 0.08), suggesting consistent exam difficulty. However, significant differences were found across categories: questions on adult orthopedics were answered correctly more often than those on pediatric orthopedics (69% vs. 65%; p = 0.003). Among the subcategories, musculoskeletal trauma yielded significantly better results than spine, shoulder and elbow, hand, and knee (p < 0.05).
Conclusion
EMNOT has proven to be a stable and appropriately challenging tool for assessing theoretical knowledge in orthopedics and traumatology. Nonetheless, the observed differences in performance by thematic subcategory highlight potential gaps in academic training, particularly in pediatrics and upper extremity topics, underscoring the exam's value in guiding curricular adjustments within residency programs.
Level of Evidence
III.
Introduction
In recent decades, knowledge assessments have been implemented worldwide for residents graduating from training programs in Orthopedics and Traumatology.[1] [2] [3] [4] These exams serve to objectively assess the cognitive skills of graduates from different training programs, compare their performance, and inform decisions about graduates' readiness for professional practice.[5] [6]
In Chile, the Teaching Committee of the Chilean Society of Orthopedics and Traumatology (SCHOT) began developing a multiple-choice theoretical exam in 2007 for all residents graduating from university programs in this specialty in our country,[7] [8] with the aim of assessing and supporting the theoretical training of postgraduate students. The Chilean National Medical Examination of Orthopedics and Traumatology (EMNOT) was first administered in 2009, and since then its results have been used confidentially by universities to assess the theoretical performance of their students in various areas of the specialty.[8] Given the informative role of the exam, it is voluntary and does not certify the specialty for graduates of Chilean university programs.[7] [8] Since 2014, however, the exam has assumed an accrediting role for medical specialists trained abroad, for whom passing it is a requirement for validation of the specialist degree.
Considering the significant educational impact this exam has had, it is necessary to evaluate the results of its administration to assess its validity. Although previous studies have analyzed the distribution and taxonomy of EMNOT questions[8] and evaluated discrimination and difficulty indices,[7] performance based on the percentage of correct answers in the different subject areas examined has not been described. Detecting potential performance gaps in specific subareas such as spine, pediatric orthopedics, or upper extremity could guide curricular adjustments, improve the theoretical training of residents, and strengthen the formative use of the exam. In this context, the objective of this study was to evaluate the performance of EMNOT questions by subject area and subareas during the 2009–2023 period.
Methods
Design
Retrospective observational cohort study, approved by the SCHOT Teaching Committee and the Scientific Ethics Committee of the Faculty of Medicine of the Pontifical Catholic University of Chile (resolution number 190925008).
Chilean National Medical Examination of Orthopedics and Traumatology (EMNOT)
The EMNOT is designed and administered by the SCHOT Teaching Committee, a body of professionals with recognized experience, preferably university professors with experience in medical education, selected by invitation by the committee itself or by the board of directors of the scientific society. This exam consists of 120 multiple-choice questions, prepared according to a knowledge profile agreed upon by the committee and aligned with the competencies defined in Technical Standard No. 145 for the certification of medical specialties and subspecialties.[9] This public profile was generated from the analysis of national university programs accredited by the Accrediting Agency for Postgraduate Programs in Medical Specialties and Training Centers for Medical Specialists (APICE).[10] The EMNOT questions present different levels of complexity, ranging from information recall to clinical judgment, in accordance with the taxonomy described by Buckwalter et al.[8] [11]
Sample
For this study, all exams from the 2009-2023 period were included, excluding responses from students who graduated from foreign programs. This was done to ensure greater homogeneity among the study group and to avoid potential biases associated with curricular and contextual differences. It is also important to note that no identifying data from universities and/or examinees were included in the analysis.
Variables analyzed
The unit of analysis was all questions included in the EMNOT from 2009 to 2023, with the dependent variable being the percentage of correct responses (PCR), defined as the percentage of residents who answered each question correctly.
For the analysis, one evaluator (AA) classified the questions into the subject areas of basic and applied health sciences, adult orthopedics, and pediatric orthopedics, and into the subareas of musculoskeletal trauma, spine, miscellaneous, knee, hip, ankle and foot, hand, shoulder and elbow, and oncology. The classification used the American Orthopaedic In-Training Examination (OITE) classification as a reference,[3] similar to that described by Urrutia et al.[8] and currently used by the question bank of the SCHOT Teaching Committee. In case of doubt, a second specialist with more than 20 years of experience (MO) reviewed and classified the questions. The topics addressed in each knowledge area are described in [Table 1].
| Subject areas | Topics |
|---|---|
| Basic and Applied Health Science | Biomechanics, Public Health, Research, Osteoarticular Infections, Cell Biology, and Clinical Anatomy |
| Adult | Patients over 14 years of age; pathologies exclusive to adults (e.g., hip osteoarthritis) |
| Pediatrics | Patients under 14 years of age |
Statistical Analysis
The distribution of numerical variables was assessed using the Shapiro-Wilk test and subsequently summarized using their medians and interquartile ranges (IQR: 25%-75%). Categorical variables were described using absolute and relative frequencies.
Given that the dependent variable corresponded to a proportion between 0 and 1 (percentage of correct answers) and its distribution was not normal, fractional logistic regression was used to evaluate performance across the different subject areas and subareas.[12] Since the coefficients in this model are expressed in log odds and are not directly interpretable on the outcome scale (percentages), average marginal effects (AME) were estimated and reported. These statistics allow for the interpretation of the expected change in the proportion of correct answers associated with each category of the independent variable (subject areas and subareas), compared to a reference category.
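For reference, a minimal formulation of the fractional logit model and of the average marginal effect for a category indicator is the following (the notation is ours; the article reports only the resulting estimates):

$$E[y_i \mid x_i] = \frac{\exp(x_i'\beta)}{1 + \exp(x_i'\beta)}, \qquad \widehat{\mathrm{AME}}_k = \frac{1}{n}\sum_{i=1}^{n}\left[\hat{E}(y_i \mid d_{ik}=1) - \hat{E}(y_i \mid d_{ik}=0)\right],$$

where $y_i$ is the proportion of correct answers for question $i$, $x_i$ contains the category indicators, and $d_{ik}$ indicates whether question $i$ is assigned to category $k$ rather than to the reference category.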
To facilitate the interpretation of the results, the AMEs obtained from the fractional logistic regression model were transformed to a percentage scale by multiplying them by 100. This conversion allows the estimated effect to be expressed as the expected percentage change in the proportion of correct answers associated with each independent category, relative to the reference category. For example, an AME of 0.05 would be interpreted as an average increase of 5% in that proportion. In this context, negative values indicate a lower proportion of correct answers compared to the reference, while positive values represent higher performance.
P values less than 0.05 were considered significant. Statistical analyses were performed using STATA 16 software (StataCorp. 2019. Stata Statistical Software: Release 16. College Station, TX: StataCorp LLC.).
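As an illustration, a minimal Stata sketch of this workflow is shown below; the variable names pcr (proportion of correct answers per question, scaled 0–1) and area (categorical subject area, with adult as the reference level) are hypothetical, since the dataset is not published:

```stata
* Fractional logistic regression of the proportion of correct answers
* on the subject-area factor (hypothetical variable names)
fracreg logit pcr i.area

* Average marginal effects of each area relative to the reference category;
* multiplying the estimates by 100 expresses them on the percentage scale
margins, dydx(area)
```

The same pattern applies to the subarea model by replacing area with the subarea factor.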
Results
A total of 1,742 questions from the EMNOT were evaluated between 2009 and 2023, with a median of 120 questions per year (range 90-120).
During the period analyzed, the distribution of questions by area was as follows: 47.36% in the adult area (n = 825), 28.01% in basic and applied health sciences (n = 488), and 24.63% in the pediatric area (n = 429). The subareas with the highest proportion of questions were musculoskeletal trauma (28.47%), spine (13.32%), and miscellaneous (13.20%) ([Table 2]).
| Subarea | Absolute frequency | Percentage |
|---|---|---|
| Musculoskeletal Trauma | 496 | 28.47 |
| Spine | 232 | 13.32 |
| Miscellaneous | 230 | 13.20 |
| Knee | 156 | 8.96 |
| Hip | 153 | 8.78 |
| Ankle and Foot | 152 | 8.73 |
| Hand | 132 | 7.58 |
| Shoulder and Elbow | 114 | 6.54 |
| Orthopedic Oncology | 77 | 4.42 |
The median percentage of correct answers per question for the period was 68% (IQR [p25–p75]: 49–83%). [Table 3] describes the percentages of correct answers per question by year, area, and subarea. No differences were observed in the percentage of correct answers across years (AME on the percentage scale: 0.24; 95% CI −0.03 to 0.51; p = 0.08).
Percentage of correct answers per question (minimum, p25, p50, p75, and maximum), by year, area, and subarea:

| Year | Number of questions | Minimum | p25 | p50 | p75 | Maximum |
|---|---|---|---|---|---|---|
| 2009 | 90 | 0 | 48 | 63 | 85 | 100 |
| 2010 | 110 | 0 | 58 | 77 | 88 | 100 |
| 2011 | 120 | 6 | 50 | 68 | 82 | 100 |
| 2012 | 120 | 0 | 42.5 | 62 | 78 | 100 |
| 2013 | 120 | 8 | 46 | 65 | 79.5 | 100 |
| 2014 | 118 | 3 | 46 | 64 | 79 | 95 |
| 2015 | 120 | 0 | 47 | 64 | 78 | 98 |
| 2016 | 120 | 8 | 50 | 71 | 89 | 99 |
| 2017 | 120 | 25 | 54.5 | 67 | 78.5 | 93 |
| 2018 | 120 | 6 | 49 | 63.5 | 80 | 99 |
| 2019 | 120 | 6 | 51.5 | 73 | 88 | 100 |
| 2020 | 120 | 0 | 58 | 76 | 88 | 100 |
| 2021 | 120 | 5 | 40.5 | 67 | 83 | 100 |
| 2022 | 120 | 5 | 50 | 71 | 86 | 98 |
| 2023 | 104 | 6 | 55 | 72 | 86 | 100 |

| Area | Number of questions | Minimum | p25 | p50 | p75 | Maximum |
|---|---|---|---|---|---|---|
| Adult | 825 | 0 | 51 | 69 | 85 | 100 |
| Basic and Applied Health Science | 488 | 0 | 50 | 68 | 81 | 100 |
| Pediatrics | 429 | 0 | 46 | 65 | 81 | 100 |

| Subarea | Number of questions | Minimum | p25 | p50 | p75 | Maximum |
|---|---|---|---|---|---|---|
| Musculoskeletal Trauma | 496 | 0 | 51 | 73 | 87 | 100 |
| Spine | 232 | 0 | 46.5 | 64 | 77.5 | 100 |
| Miscellaneous | 230 | 0 | 48 | 70 | 82 | 100 |
| Knee | 156 | 6 | 44 | 67 | 82 | 100 |
| Hip | 153 | 12 | 52 | 69 | 88 | 100 |
| Ankle and Foot | 152 | 3 | 56 | 68 | 81 | 100 |
| Hand | 132 | 9 | 44 | 63 | 83.5 | 100 |
| Shoulder and Elbow | 114 | 4 | 47 | 64 | 79 | 97 |
| Orthopedic Oncology | 77 | 22 | 56 | 67 | 79 | 100 |
When comparing the percentage of correct responses by area, a statistically significant difference was observed between the adult (69% correct responses) and pediatric (65% correct responses) areas (p = 0.003). When analyzing the subareas, significant differences were observed in the percentage of correct responses for musculoskeletal trauma (73%) versus hand, spine, shoulder and elbow, and knee (63–67%) ([Table 3]).
In the fractional regression analysis ([Table 4]), it was observed that the pediatric category presented significantly lower performance compared to the adult category, with 3.98% fewer correct answers on average (marginal effect) (p-value = 0.003). No significant differences were found between the basic sciences and adult areas (p-value = 0.215).
| Area | AME (percentage scale) | Standard error | 95% CI | P value |
|---|---|---|---|---|
| Adult | reference | – | – | – |
| Basic and Applied Health Science | −1.68 | 1.36 | −4.34 to 0.98 | 0.216 |
| Pediatrics | −3.98 | 1.33 | −6.59 to −1.36 | 0.003 |

| Subarea | AME (percentage scale) | Standard error | 95% CI | P value |
|---|---|---|---|---|
| Musculoskeletal Trauma | reference | – | – | – |
| Miscellaneous | −3.55 | 1.85 | −7.17 to 0.06 | 0.055 |
| Oncology | −1.97 | 2.37 | −6.63 to 2.68 | 0.407 |
| Spine | −7.08 | 1.87 | −10.75 to −3.40 | <0.001 |
| Shoulder and Elbow | −5.40 | 2.34 | −9.99 to −0.82 | 0.021 |
| Hand | −4.97 | 2.32 | −9.52 to −0.42 | 0.032 |
| Hip | 0.62 | 2.09 | −3.49 to 4.72 | 0.768 |
| Knee | −4.57 | 2.17 | −8.82 to −0.32 | 0.035 |
| Ankle and Foot | −1.96 | 2.08 | −6.04 to 2.12 | 0.347 |
When analyzed by subarea, questions on the spine (AME on the percentage scale −7.08%, p < 0.001), shoulder and elbow (−5.40%, p = 0.021), hand (−4.97%, p = 0.032), and knee (−4.57%, p = 0.035) were significantly less likely to be answered correctly than questions on musculoskeletal trauma (the reference category) ([Figure 1]). The hip, ankle and foot, oncology, and miscellaneous (basic sciences) subareas showed no statistically significant differences relative to musculoskeletal trauma.


Discussion
Currently, the EMNOT plays an important role in assessing the performance of specialists recently graduated from Chilean programs and in revalidating the qualifications of specialists trained abroad. Previous studies have described the distribution of questions and their psychometric characteristics.[7] [8] This is the first study to evaluate examinee performance by subject area and subarea during the 2009–2023 period.
The main finding of the study was that the EMNOT showed stable theoretical performance over time, with a median correct response rate of 68%, similar to the pass rate of the OITE exam (66%).[13] It is important to note that every category and subcategory had a median of more than 60% correct answers (the minimum percentage recommended for passing a theoretical exam).[14]
Since no differences were observed across the years in which the exam was administered, it can be inferred that the EMNOT maintains a constant difficulty over time. This finding is particularly relevant because, although studies have documented variations in OITE scores depending on the year of residency,[15] as well as the stability of the distribution of questions across areas and taxonomic levels,[13] [16] [17] no studies were found that directly evaluate the stability of examinees' performance over time.
Although in our study the EMNOT showed stable performance over time, differences in performance were evident by subject area. When evaluating the PCR by category, it was observed that pediatric questions had an average of 3.98% fewer correct answers than adult questions. It is important to note that although the PCR was significantly lower in pediatrics, the median PCR was 65%, higher than the recommended passing score (60%). Regarding the number of pediatric questions, it is interesting to mention that in the American OITE exam, questions from this subject area account for between 11 and 14% of the total,[13] [18] [19] [20] whereas in the EMNOT the proportion is higher, accounting for 24.6% of the total (n = 429 questions). When analyzing performance by subarea, better performance was observed in questions on musculoskeletal trauma compared with questions on the spine, shoulder and elbow, hand, and knee.
The lower performance in pediatrics (versus adult) and in the spine, shoulder and elbow, hand, and knee subareas (versus musculoskeletal trauma) could be explained by residents' lower clinical or academic exposure to these areas during training, greater complexity of the knowledge involved, and/or differences in question formulation (complexity, clarity of the stem, and quality of the alternatives). However, it is important to emphasize that the existence of statistical differences does not necessarily imply a clinically relevant training gap. For example, in the case of the pediatric and adult areas, the difference in the PCR was only 4 percentage points, which, although statistically significant, may not be practically relevant for knowledge acquisition.
This study has some limitations, primarily due to its retrospective observational nature. Individual variables such as training center or resident performance during training were not considered in the analysis. Nor were variables such as taxonomy, discrimination index, and question quality assessment analyzed, although these variables may be associated with the percentage of correct answers. A strength is that the entire EMNOT question pool was studied over a 15-year period.
Future studies should explore the relationship between EMNOT performance and specific training program characteristics, the impact of curricular reforms on performance by subspecialty, and assess the quality of exam questions.
Conclusion
The Chilean National Medical Examination in Orthopedics and Traumatology is a formative tool with adequate difficulty and stable performance over time. The median percentage of correct answers per question was above the recommended passing threshold; however, significant differences were observed by subject area and subcategory. Specifically, the adult and musculoskeletal trauma content showed better performance than pediatrics and other subareas such as spine, hand, and shoulder and elbow.
No conflict of interest has been declared by the author(s).
Acknowledgement
We thank the Teaching Committee of the Chilean Society of Orthopedics and Traumatology for their support of this line of research.
References
- 1 Dougherty PJ, Walter N, Schilling P, Najibi S, Herkowitz H. Do scores of the USMLE Step 1 and OITE correlate with the ABOS Part I certifying examination?: a multicenter study. Clin Orthop Relat Res 2010; 468 (10) 2797-2802
- 2 Cho Y, Kim JY, Park JH. Analysis of the Korean Orthopedic In-Training Examination: The Hip and Pelvis Section. Hip Pelvis 2016; 28 (03) 157-163
- 3 Fones L, Osbahr DC, Davis DE, Star AM, Ahmed AK, Saxena A. Analysis of Orthopaedic In-Training Examination Trauma Questions: 2017 to 2021. J Am Acad Orthop Surg Glob Res Rev 2023; 7 (03) e22.00180
- 4 American Academy of Orthopaedic Surgeons. About ABOS [Internet]. 2020 [cited 2025 Jul 7]. Available from: https://www.aaos.org/about/
- 5 Le HV, Wick JB, Haus BM, Dyer GSM. Orthopaedic In-Training Examination: History, Perspective, and Tips for Residents. J Am Acad Orthop Surg 2021; 29 (09) e427-e437
- 6 Swanepoel S, Dunn R, Klopper J, Held M. The FC Orth(SA) final examination: how effective is the written component? SA Orthop J 2018;17(3)
- 7 Lira MJ, Besa P, Irarrázaval S, et al. Evaluation of the Chilean National Orthopaedic Examination over 11 years: progress and outcomes of national and international examinees. J Am Acad Orthop Surg Glob Res Rev 2024; 8 (01) e23.00168
- 8 Urrutia J, Orrego M, Wright AC, Amenabar D. An assessment of the Chilean National Examination of Orthopaedic Surgery. BMC Med Educ 2016; 16 (01) 78
- 9 Ministerio de Salud. Norma Técnica N° 145. Certificación de especialidades médicas [Internet]. Santiago: Biblioteca del Congreso Nacional; 2017 [cited 2025 Jul 7]. Available from: https://www.bcn.cl/leychile/navegar?idNorma=1048964
- 10 APICE. Requisitos específicos de un programa de formación de especialistas en Traumatología y Ortopedia [Internet]. Santiago: APICE; 2014 [cited 2025 Jul 7]. Available from: http://www.apicechile.cl/images/stories/doc/imagenes/traumatologia_ortopedia.pdf
- 11 Buckwalter JA, Schumacher R, Albright JP, Cooper RR. Use of an educational taxonomy for evaluation of cognitive performance. J Med Educ 1981; 56 (02) 115-121
- 12 Meaney C, Moineddin R. A Monte Carlo simulation study comparing linear regression, beta regression, variable-dispersion beta regression and fractional logit regression at recovering average difference measures in a two sample design. BMC Med Res Methodol 2014; 14 (01) 14
- 13 American Academy of Orthopaedic Surgeons. Orthopaedic In-Training Examination (OITE): Technical Report 2022.
- 14 Cusimano MD. Standard setting in medical education. Acad Med 1996; 71 (10, Suppl) S112-S120
- 15 DePasse JM, Haglin J, Eltorai AEM, Mulcahey MK, Eberson CP, Daniels AH. Orthopedic in-training examination question metrics and resident test performance. Orthop Rev (Pavia) 2017; 9 (02) 7006
- 16 Lee E, O'Sullivan L, Spiker AM. Analysis of Hip Preservation Questions on the Orthopaedic In-Training Examination Over the Past 20 Years. Orthop J Sports Med 2024;12(5):23259671241237503
- 17 Klein B, LaGreca M, White PB, Trasolini R, Cohn RM. Comparison of Sports Medicine Questions on the Orthopaedic In-Training Examination Between 2009 and 2012 and 2017 and 2020 Reveals an Increasing Number of References. Arthrosc Sports Med Rehabil 2023; 5 (02) e479-e488
- 18 Papp DF, Ting BL, Sargent MC, Frassica FJ. Analysis of the Pediatric Orthopedic Surgery Questions on the Orthopaedic In-Training Examination, 2002 through 2006 [Internet]. 2010 [cited 2025 Jul 7]. Available from: http://www.pedorthopaedics.com
- 19 Murphy RF, Nunez L, Barfield WR, Mooney JF. Evaluation of Pediatric Questions on the Orthopaedic In-Training Examination: An Update [Internet]. 2016 [cited 2025 Jul 7]. Available from: http://www.pedorthopaedics.com
- 20 Ellsworth BK, Premkumar A, Shen T, Lebrun DG, Cross MB, Widmann RF. An updated analysis of the pediatric section of the Orthopaedic In-Training Examination. J Pediatr Orthop 2020; 40 (10) e1017-e1021
Address for correspondence
Publication History
Received: 07 July 2025
Accepted: 07 August 2025
Article published online:
22 December 2025
© 2025. Sociedad Chilena de Ortopedia y Traumatologia. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Thieme Revinter Publicações Ltda.
Rua Rego Freitas, 175, loja 1, República, São Paulo, SP, CEP 01220-010, Brazil




