J Wrist Surg 2019; 08(02): 104-107
DOI: 10.1055/s-0038-1673335
Scientific Article

Moderate Interrater Reliability in the Diagnosis of Scaphoid Waist Fractures

Rasmus Wejnold Jørgensen, Claus Hjorth Jensen
Department of Orthopedics, Hand Clinic, Herlev-Gentofte University Hospital of Copenhagen, Copenhagen, Denmark

Publication History

Received: 04 May 2018

Accepted: 02 August 2018

Publication Date: 18 October 2018 (online)

Abstract

Background Conventional radiographs have been shown to yield unreliable results in classifying scaphoid fractures. Computed tomography (CT) has been claimed to be the tool of choice for assessing fracture displacement and guiding treatment.

Purpose The purpose of this study was to examine the interrater reliability and intrarater reproducibility of treatment decisions for scaphoid waist fractures.

Patients and Methods Fifty-one CT scans of scaphoid waist fractures were used. Seven orthopaedic surgeons with a particular interest in hand surgery independently scrutinized the scans, classifying each fracture as undisplaced, displaced < 2 mm, or displaced > 2 mm, and suggested treatment with either cast immobilization or screw fixation. Fleiss' and Cohen's kappa values were calculated using SPSS (Statistical Package for the Social Sciences) version 24 and interpreted according to Landis and Koch.
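
For readers who want to reproduce this kind of analysis, the following is a minimal sketch of how Fleiss' and Cohen's kappa could be computed and interpreted per Landis and Koch. It is not the authors' SPSS workflow: it uses Python with the open-source statsmodels and scikit-learn libraries, and the ratings shown are synthetic placeholders, not the study data.

import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Synthetic placeholder ratings: 51 scans x 7 raters, each rating on a
# 3-point scale (0 = undisplaced, 1 = displaced < 2 mm, 2 = displaced > 2 mm).
ratings = rng.integers(0, 3, size=(51, 7))

# Fleiss' kappa (interrater reliability across all seven raters).
# aggregate_raters converts (subjects x raters) into per-category counts.
counts, _categories = aggregate_raters(ratings)
print("Fleiss' kappa:", fleiss_kappa(counts, method="fleiss"))

# Cohen's kappa (intrarater reproducibility: one rater's two readings of
# the same scans; the repeat reading here is also a synthetic placeholder).
first_reading = ratings[:, 0]
repeat_reading = rng.integers(0, 3, size=51)
print("Cohen's kappa:", cohen_kappa_score(first_reading, repeat_reading))

def landis_koch(kappa):
    """Verbal interpretation of a kappa value per Landis and Koch (1977)."""
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(landis_koch(0.58))  # "moderate": the study's interrater value for
                          # operative vs. nonoperative treatment
print(landis_koch(0.61))  # "substantial": the study's interrater value for
                          # < 2 mm vs. > 2 mm displacement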

Results The kappa value for interrater reliability when choosing between operative and nonoperative treatment was 0.58. Interrater reliability for the distinction between fractures displaced < 2 mm and those displaced > 2 mm was 0.61. On average, nonoperative treatment was suggested for 79.5% of the fractures and operative treatment for 20.5%. Overall, intrarater reproducibility was 0.75 for classifying fractures as displaced < 2 mm or > 2 mm, and 0.69 for choosing between operative and nonoperative treatment.

Conclusion Interrater reliability was moderate when choosing between nonoperative and operative treatment. CT showed substantial interrater reliability for the distinction between fractures displaced < 2 mm and those displaced > 2 mm. Intrarater reproducibility was substantial both for classifying displacement (< 2 mm vs. > 2 mm) and for choosing between operative and nonoperative treatment.

Level of Evidence This is a Level III study.

Ethical Approval

No human participants or personal data were involved in this study.

References

1. Grewal R, King GJ. An evidence-based approach to the management of acute scaphoid fractures. J Hand Surg Am 2009;34(4):732-734.
2. Temple CL, Ross DC, Bennett JD, Garvin GJ, King GJ, Faber KJ. Comparison of sagittal computed tomography and plain film radiography in a scaphoid fracture model. J Hand Surg Am 2005;30(3):534-542.
3. Davis TR. Prediction of outcome of non-operative treatment of acute scaphoid waist fracture. Ann R Coll Surg Engl 2013;95(3):171-176.
4. Dias JJ, Taylor M, Thompson J, Brenkel IJ, Gregg PJ. Radiographic signs of union of scaphoid fractures. An analysis of inter-observer agreement and reproducibility. J Bone Joint Surg Br 1988;70(2):299-301.
5. Smith M, Bain GI, Turner PC, Watts AC. Review of imaging of scaphoid fractures. ANZ J Surg 2010;80(1-2):82-90.
6. Buijze GA, Wijffels MM, Guitton TG, Grewal R, van Dijk CN, Ring D; Science of Variation Group. Interobserver reliability of computed tomography to diagnose scaphoid waist fracture union. J Hand Surg Am 2012;37(2):250-254.
7. de Zwart AD, Beeres FJ, Kingma LM, Otoide M, Schipper IB, Rhemrev SJ. Interobserver variability among radiologists for diagnosis of scaphoid fractures by computed tomography. J Hand Surg Am 2012;37(11):2252-2256.
8. Gjørup T. Reliability of diagnostic tests. Acta Obstet Gynecol Scand Suppl 1997;166:9-14.
9. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics 1977;33(1):159-174.
10. Walter SD, Eliasziw M, Donner A. Sample size and optimal designs for reliability studies. Stat Med 1998;17(1):101-110.
11. Kottner J, Audige L, Brorson S, et al. Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed. Int J Nurs Stud 2011;48(6):661-671.