Ultraschall Med 2017; 38(06): 642-647
DOI: 10.1055/s-0043-119354
Original Article
© Georg Thieme Verlag KG Stuttgart · New York

Cost-Effectiveness of Mobile App-Guided Training in Extended Focused Assessment with Sonography for Trauma (eFAST): A Randomized Trial

Kosteneffizienz des App-basierten Selbststudiums im „Extended Focused Assessment with Sonography for Trauma“ (eFAST): Eine randomisierte Studie
Philip Mørkeberg Nilsson
1   Simulationscenter Rigshospitalet, Copenhagen Academy for Medical Education and Simulation, Copenhagen, Denmark
,
Tobias Todsen
1   Simulationscenter Rigshospitalet, Copenhagen Academy for Medical Education and Simulation, Copenhagen, Denmark
2   Department of Otorhinolaryngology and Maxillofacial Surgery, Zealand University Hospital, Køge, Denmark
,
Yousif Subhi
3   Department of Ophthalmology, Zealand University Hospital, Roskilde, Denmark
,
Ole Graumann
4   Dep. of Radiology, Odense University Hospital, Denmark
5   Institute of Clinical Research, University of Southern Denmark, Odense, Denmark
6   Institute of Clinical Research, Aarhus University, Aarhus, Denmark
,
Christian Pallson Nolsøe
1   Simulationscenter Rigshospitalet, Copenhagen Academy for Medical Education and Simulation, Copenhagen, Denmark
7   Ultrasound Section, Division of Surgery, Dep of Gastroenterology, Herlev Hospital, Herlev, Denmark
,
Martin Grønnebæk Tolsgaard
1   Simulationscenter Rigshospitalet, Copenhagen Academy for Medical Education and Simulation, Copenhagen, Denmark
8   Department of Obstetrics and Gynecology, Nordsjællands Hospital, Hillerød, Denmark

Correspondence

Philip Mørkeberg Nilsson
Simulationscenter Rigshospitalet, Copenhagen Academy for Medical Education and Simulation
Blegdamsvej 9
DK-2100 Copenhagen
Denmark   
Phone: +45 35 45 54 04

Publication History

02 March 2017

20 August 2017

Publication Date:
26 September 2017 (online)

 

Abstract

Purpose Ultrasound training is associated with a long learning curve and use of substantial faculty resources. Self-directed ultrasound training may decrease the need for faculty-led teaching. Mobile apps seem promising for use in self-directed ultrasound training, but no studies have examined the cost-effectiveness of mobile app-guided training versus traditional formats such as textbook-guided training. This study evaluated the cost-effectiveness of mobile app-guided versus textbook-guided ultrasound training.

Material and methods First-year residents (n = 38) with no previous ultrasound experience were randomized to mobile app-guided or textbook-guided self-directed ultrasound training. Participants completed a transfer test involving four patient cases and a theoretical test on diagnostic accuracy. Two ultrasound experts assessed the residents’ performance using the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale. The costs of developing the mobile app and the textbook material were calculated and used in the cost-effectiveness analysis.

Results 34 participants completed the transfer test. There was no statistically significant difference in test performance or diagnostic accuracy between the mobile app-guided (mean OSAUS 42.3 % [95 % CI 38.5 – 46.0 %]) and textbook-guided groups (mean OSAUS 45.3 % [95 % CI 39.3 – 51.3 %]) (F [1, 33] = 0.45, p = 0.41). However, development costs differed greatly between the two instructional formats. Textbook-guided training was significantly more cost-effective than mobile app-guided training (incremental cost-effectiveness ratio -861.97 [95 % CI -1071.7 to -3.2] USD per percentage point change in OSAUS score).

Conclusion Mobile app-guided ultrasound training is less cost-effective than textbook-guided self-directed training. This study underlines the need for careful evaluation of cost-effectiveness when introducing technological innovations for clinical skills training.



Zusammenfassung

Ziel Das Erlernen der Sonografie ist in der Regel ein langwieriger, mit erheblichen institutionellen Kosten verbundener Prozess. Individuelles Selbststudium vermag das Bedürfnis nach institutionellem Unterricht möglicherweise zu verringern. Apps sind in diesem Zusammenhang eine vielversprechende Alternative, jedoch wurde bisher in keiner Studie die Kosteneffizienz von Apps mit der Kosteneffizienz traditioneller Medien, wie beispielsweise Lehrbüchern, verglichen. Das Ziel dieser Studie war die Auswertung und der Vergleich der Kosteneffizienz von App-basiertem und lehrbuchbasiertem Erlernen der Sonografie.

Material, Methoden Ärzte im Praktikum (n = 38) ohne vorherige sonografische Erfahrung wurden durch Randomisierung in eine App- und eine Lehrbuchgruppe eingeteilt. Die Teilnehmer absolvierten jeweils eine praktische und eine theoretische Prüfung. Der praktische Teil bestand aus vier patientenbasierten Fällen, während durch den theoretischen Teil die diagnostische Genauigkeit der Teilnehmer bewertet wurde. Zwei Sonografie-Experten beurteilten die Leistung der Teilnehmer durch Anwendung der „Objective Structured Assessment of Ultrasound Skills“ (OSAUS) Messskala. Die Kosten für die Entwicklung der App sowie die Ausarbeitung des Lehrbuches wurden berechnet, und die beiden Lernmedien wurden jeweils auf ihre Kosteneffizienz untersucht.

Ergebnisse 34 Teilnehmer absolvierten den Test. Im Hinblick auf die diagnostische Genauigkeit konnte kein statistisch signifikanter Unterschied zwischen der App-Gruppe (durchschnittliche OSAUS-Bewertung 42.3 % [95 % CI 38.5 – 46.0 %]) und der Lehrbuch-Gruppe (durchschnittliche OSAUS-Bewertung 45.3 % [95 % CI 39.3 – 51.3 %]) ausgemacht werden (F [1, 33] = 0.45, p = 0.41). Bei den Entwicklungskosten der beiden Lernmedien wurden jedoch erhebliche Abweichungen aufgedeckt. Somit erwies sich das lehrbuchbasierte Erlernen der sonografischen Diagnostik im Vergleich zum App-basierten Erlernen als wesentlich kosteneffizienter.

Schlussfolgerung Die Anwendung der Sonografie lässt sich kosteneffizienter durch lehrbuchbasiertes als durch App-basiertes Selbststudium erlernen. Bevor neue technologische Innovationen für das Erlernen praktischer Fähigkeiten implementiert werden, sollten diese sorgfältig auf ihre jeweilige Kosteneffizienz untersucht werden, was durch die Ergebnisse dieser Studie verdeutlicht wird.



Introduction

The use of point-of-care ultrasound by clinicians, such as Focused Assessment with Sonography in Trauma (FAST), has increased [1]. The adoption of ultrasound into clinical practice has been facilitated by the development of portable low-cost ultrasound equipment [2] and training guidelines such as the Advanced Trauma Life Support algorithm [3]. However, ultrasound is still highly operator dependent, and sufficient hands-on training is needed to ensure diagnostic accuracy [4] [5]. Traditional educational practices often involve considerable amounts of dedicated faculty time for teaching and assessment [5]. The resource-intensive nature of ultrasound training, therefore, calls for more efficient methods to ensure adequate training at reasonable costs.

Smartphones and tablets are gaining popularity for acquiring knowledge and skills related to clinical tasks through the use of mobile applications (apps) [6] [7]. Apps on these devices are software programs that can integrate videos, pictures, text, and weblinks into one coherent learning platform [8]. Mobile app-guided training may be particularly suitable for learning clinical skills [9] [10], because mobile apps allow learners to view video demonstrations and written material during self-directed hands-on practice. Despite the increasing use of mobile apps in medical education [11] [12], no previous studies have investigated the effectiveness of mobile app-guided training or its costs compared to traditional learning formats [13]. Hence, the aim of this study was to evaluate the cost-effectiveness of mobile app-guided versus textbook-guided ultrasound training, as well as learning outcomes in terms of clinical performance, for first-year residents with no previous ultrasound experience.



Methods

A randomized controlled trial was conducted at our institution. The study was designed as a rater-blinded superiority study with 1:1 randomization. Ethical approval was obtained in the form of an exemption letter from the Regional Ethical Committee of the Capital Region, Denmark (protocol no. H-3-2014-FSP17). The study is reported in accordance with the CONSORT statement (http://www.consort-statement.org) and registered at http://www.clinicaltrials.gov (identification no. NCT02156921).

Newly graduated interns employed at a hospital in the Capital Region of Denmark and the Region of Zealand were invited by e-mail to participate in the study and were included on a first-come, first-served basis. The inclusion criterion was less than one year of postgraduate clinical experience. The exclusion criteria were prior practical ultrasound experience, such as performing independent ultrasound scans, or participation in any postgraduate ultrasound courses, to ensure the same level of competence among participants.

Participants were randomized using a computer-based online randomization program [14] into either the intervention group (mobile app-guided training) or the control group (textbook-guided training).

The participants in both groups completed a two-hour ultrasound training session focusing on Extended Focused Assessment with Sonography for Trauma (eFAST) [15]. “Extended” refers to scanning for pneumothorax, which is not included in the original FAST examination. Participants were instructed to practice eFAST as demonstrated in the mobile app (intervention group) or the textbook chapter (control group) (see [Fig. 1]). They were guided to practice each of the six projections in the eFAST scan (intercostal bilaterally, pericardial, subphrenic bilaterally, and bladder view) separately, before finally practicing a full session of the entire eFAST scan. The ultrasound machine used was a GE Logiq e with a convex 2.0 – 5.5 MHz transducer (GE Healthcare, Little Chalfont, UK).

Fig. 1 a Example of textbook-guided training (control); b example of mobile app-guided training (intervention).

An iPad 3 (Apple, Cupertino, USA), loaded with a mobile app on eFAST examination, was used for hands-on training in the intervention group. Using video clips, an experienced radiologist demonstrated how to perform the different views in an eFAST. The mobile app also contained pictures and text regarding normal and pathological findings and an introduction module that teaches how to use the ultrasound equipment and optimize images. The mobile app used in this study can be accessed at http://pmn2.cekuapp.dk/ (in Danish).

The control group had the same training circumstances as the intervention group but instead were provided with a text- and picture-based book chapter containing the same curriculum as provided in the mobile app designed for the intervention group. For example, instead of the video instruction in the mobile app, the book chapter had step-by-step written instructions and still photos. The textbook chapter used in this study can be accessed at http://pmn2.cekuapp.dk/APPENDIX_eFAST_textbook_guided.pdf (in Danish).

The costs of developing both training formats were calculated by tracking the working hours for content development (video recording versus still photos), editing, and media format production, using the corresponding hourly rates for the radiologist and student assistants at our institution ([Table 1]). The written instructions in the textbook chapter (control group) and the spoken instructions in the video (intervention group) were kept as equal as possible, so that the time spent writing the instructions was the same for both content productions.

Table 1

Working hours and development costs. Costs converted from Danish kroner to U.S. dollars.

                                          Mobile app    Textbook
                                          development   development
Radiologist’s hourly rate, USD                 82.9         82.9
Student assistant’s hourly rate, USD           37.7         37.7
Content production
  Radiologist’s time, hours                    12            4
  Student assistant’s time, hours              29           10
  Total content production cost, USD         2088.1        708.6
Editing
  Student assistant’s time, hours              24            6
  Total editing cost, USD                     904.8        226.2
Media format production
  Student assistant’s time, hours              20            6
  Total media production cost, USD            754.0        226.2
Total working hours                            85           26
Total cost, USD                              3746.9       1161.0
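The totals in [Table 1] are simple hours-times-rate sums; the following short sketch (not part of the original study) reproduces them as a sanity check:

```python
# Development costs from Table 1: staff hours multiplied by hourly rates (USD).
RADIOLOGIST_RATE = 82.9  # USD per hour
ASSISTANT_RATE = 37.7    # USD per hour

def development_cost(radiologist_hours, assistant_hours):
    """Total development cost for one training format."""
    return radiologist_hours * RADIOLOGIST_RATE + assistant_hours * ASSISTANT_RATE

# Mobile app: content (12 h radiologist, 29 h assistant), editing (24 h), media (20 h)
app_cost = development_cost(12, 29 + 24 + 20)
# Textbook: content (4 h radiologist, 10 h assistant), editing (6 h), media (6 h)
textbook_cost = development_cost(4, 10 + 6 + 6)
```

Both totals match the table: 3746.9 USD for the mobile app and 1161.0 USD for the textbook.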

Participant performance was assessed with a transfer test two weeks after completion of the training session. The test setup included four cases: one patient with free fluid in the abdomen (patient with peritoneal dialysis recruited from Department of Nephrology, Rigshospitalet, Denmark) and three simulated patients without eFAST-verifiable pathology. Short written descriptions of the history of trauma and clinical symptoms were provided for each case. Performance was recorded via video of hand movements and video output from the ultrasound equipment. Participants dictated their findings and ultrasound diagnoses after performing each of the ultrasound scans. The video recordings of each technical performance were merged with the ultrasound screen recording in one anonymized video clip. The Objective Structured Assessment of Ultrasound Skills (OSAUS) scale was used for all performance assessments (available at http://www.osaus.org/osaus.pdf) [16]. Two items in the OSAUS were excluded, as they do not apply in experimental settings. Evidence of the OSAUS rating scale’s validity for assessing competence in point-of-care ultrasound, including the FAST algorithm, has been explored in several previous studies [16] [17] [18]. Based on a prior generalizability analysis, we estimated that two trained assessors and four different patient cases would ensure a highly reliable OSAUS score in our experimental study [18]. Two consultant radiologists with 11 and 32 years of ultrasound experience rated performance using secure web-based video rating software [19]. Prior to rating performance, the two radiologists received rater training by assessing ultrasound performance in five cases outside of this study to reach consensus.

After the practical performance test, all participants completed a theoretical test to assess their diagnostic accuracy. Participants were shown short video sequences of the eFAST scan projections and were instructed to record whether or not pathology was present. In total, participants answered 40 questions about eFAST-verifiable pathology.

Power calculation

We estimated a minimal clinically and educationally significant difference in OSAUS score of 9 % (SD 7.7) between groups, based on data from a previous study on point-of-care ultrasound training [4]. A sample size of 16 in each group was necessary to achieve a power of 0.90, with an alpha level of 0.05.
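The reported sample size is consistent with the standard normal-approximation formula for comparing two means; the authors do not state which formula or software they used, so the following is only an illustrative sketch:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.90):
    """Normal-approximation sample size per group for a two-sided
    two-sample comparison of means: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2,
    where d = delta / sd is the standardized effect size."""
    z = NormalDist().inv_cdf
    effect = delta / sd
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / effect) ** 2)

# Study assumptions: 9 percentage-point OSAUS difference, SD 7.7
n = n_per_group(delta=9.0, sd=7.7)  # yields 16 per group, as reported
```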



Statistics

All OSAUS scores were converted to a percentage of the maximum possible OSAUS score. A mixed-design (2 × 4) repeated-measures analysis of variance was performed using OSAUS scores across the four transfer test cases to assess the main effects of training and to test for interactions with the type of training (mobile app-guided or textbook-guided). The inter-rater reliability between the two raters was calculated using Cronbach’s alpha to assess the degree of consistency.

Scores on the theoretical test for diagnostic accuracy were summed into one test score by assigning a value of one to correct answers and zero to incorrect answers. These test scores were compared between groups using the Mann-Whitney U test.

The cost-effectiveness of the two training formats was evaluated using the Programme Effectiveness and Cost Generalization model for conducting cost-effectiveness analyses in medical education [20]. An incremental cost-effectiveness ratio (ICER) was calculated as the difference in cost between the two interventions divided by the difference in their effectiveness (OSAUS scores in percent). Hence, this ratio indicates the cost of one additional OSAUS percentage point. Statistical analyses were performed using SPSS Version 23 (SPSS Inc., Chicago, IL) and Excel 2011.
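The ICER reduces to a simple ratio; as an illustration (not the authors' code), using the cost and OSAUS figures reported in this study:

```python
# Incremental cost-effectiveness ratio (ICER): difference in cost divided by
# difference in effectiveness, using the figures reported in this study.
app_cost, textbook_cost = 3746.9, 1161.0  # development costs, USD
app_osaus, textbook_osaus = 42.3, 45.3    # mean OSAUS scores, percent

icer = (app_cost - textbook_cost) / (app_osaus - textbook_osaus)
# A negative ICER here means the app costs more while scoring lower:
# roughly -862 USD per OSAUS percentage point.
```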



Results

38 first-year residents participated in training, with 34 completing the transfer test, resulting in 136 recorded ultrasound cases (see flowchart of participants, [Fig. 2]). A single case from the intervention group had to be excluded from analysis due to a technical error in the recording equipment. The male/female ratio in the app-guided group was 38/62 % with a median age of 29 years (range: 26 – 34). The male/female ratio in the textbook-guided group was 56/44 % with a median age of 29 years (range: 26 – 35).

Fig. 2 Flowchart of participants.

The mobile app-guided group had a mean OSAUS score of 42.3 % [95 % CI 38.5 – 46.0 %] and the textbook-guided group had a mean OSAUS score of 45.3 % [95 % CI 39.3 – 51.3 %]. There were no between-group effects (F [1, 33] = 0.45, p = 0.41) across the four transfer test cases, and there were no significant interactions between the effects of training and the type of training on the four transfer test cases (F [3, 33] = 1.072, p = 0.38). The inter-rater reliability of the assessments for the two raters was Cronbach’s alpha = 0.68.

In the theoretical test for diagnostic accuracy, the median for correct answers was 36 (IQR = 2.44) for the intervention group and 37 (IQR = 4.0) for the control group. There was no difference in the number of correct answers (Mann-Whitney U = 104.0, p = 0.163) between groups.

Time and cost estimates for the development of the two interventions are shown in [Table 1]. The raw costs of developing the mobile app-guided and the textbook-guided training approaches were 3746.9 USD and 1161.0 USD, respectively. The ICER was -861.97 [95 % CI -1071.7 to -3.2] USD per percentage point change in OSAUS score. Hence, textbook-guided training was significantly more cost-effective than mobile app-guided training.



Discussion

There was no advantage to mobile app-guided ultrasound training over textbook-guided training with respect to either clinical performance or diagnostic accuracy. However, the monetary cost was more than three times higher for the development of the mobile app than for the textbook. Consequently, the economic analysis showed that the textbook-guided ultrasound training format was significantly more cost-effective than mobile app-guided training.

With the increasing use of point-of-care ultrasound, it is important to establish best practices for training clinicians, which may include the use of new learning technology. However, with the introduction of new technologies, there is often significant hype regarding their potential [21]. Our study highlights the importance of careful evaluation of the proposed potential of technological innovations before their adoption into clinical practice. In other words, the apparent advantages associated with app-guided clinical skills training must be weighed against their effectiveness and costs. Unfortunately, until now only a few studies in medical education have taken cost into account when evaluating new learning technology [22].

Contrary to our initial hypothesis, we found no clinically meaningful difference in transfer task performance between learners using mobile apps and textbooks. According to multimedia learning theory, the advantages associated with the integration of visual, auditory, and text explanations should have reduced the cognitive resources required by the participants using the mobile app compared to those who used textbooks [23]. On the other hand, recent studies have shown that learners tend to exhibit lower performance in their diagnostic reasoning when using multimedia formats compared to text formats, due to an additional layer of task complexity [24] [25]. This gap is thought to affect novices, such as those included in the present study, more than experienced clinicians [25]. Consequently, different results may have been obtained if we had focused on more experienced clinicians [25] or clinical tasks with different levels of complexity. Accordingly, a recent systematic review on e-learning platforms for surgical training found significant heterogeneity between learning platforms in the analyses of their effectiveness [26]. Although effectiveness and costs are important factors when evaluating new technology, other aspects such as feasibility and user acceptability should also be considered to determine the utility of mobile apps for clinical practice [27]. The role of online resources and app-based training may also change over time. For example, personal interest in and user acceptance of mobile apps may continue to increase in the future, and shifts in learner preferences may affect the choice to invest resources in the development of apps even if they may not improve learning outcomes compared to conventional training methods. The group of trainees requiring ultrasound training may also continue to grow as an increasing number of medical schools introduce ultrasound training into their undergraduate curricula [28]. 
In particular for training large numbers of medical students, app-based training may become cost-effective, as its costs remain relatively constant regardless of the number of users. This becomes increasingly important when learning materials need to be updated, distributed, and continuously revised. Finally, direct access to online learning resources may play an increasingly important role for ultrasound training in geographically remote areas and in developing countries, and this should be further explored in future research [29].

One of the strengths of this study is that we met the criteria for our power calculation, which was based on the least educationally and clinically significant difference. Moreover, although the inter-rater reliability was only moderate, the number of raters and cases was selected based on a previous generalizability study to ensure an acceptable overall reliability coefficient [18]. We were only able to include one patient with eFAST-verifiable pathology in the practical transfer test, which is a limitation, but we therefore added a theoretical test on diagnostic accuracy with 40 scans of eFAST-verifiable pathology to extend the validity of our findings.

Finally, we focused on a simple point-of-care ultrasound examination involving one institution. Further studies are needed to determine the extent to which our findings can be generalized to other settings, procedures, and clinicians.



Conclusion

Mobile app-guided ultrasound training was significantly less cost-effective compared to textbook-guided ultrasound training. Mobile apps are increasingly used in clinical education, but this study suggests that the technological hype associated with mobile apps is not supported by evidence.



No conflict of interest has been declared by the author(s).

Acknowledgments

The authors would like to thank Peter Hertz for technical assistance, Mathis Gröning for German translation, and GE Medical for providing ultrasound machines. GE Medical had no influence on any part of this study.

  • References

  • 1 Rozycki GS. Surgeon-Performed Ultrasound. Ann Surg 1998; 228: 16-28
  • 2 Staren ED, Knudson MM, Rozycki GS. et al. An evaluation of the American College of Surgeons’ ultrasound education program. Am J Surg 2006; 191: 489-496
  • 3 ATLS Subcommittee, American College of Surgeons’ Committee on Trauma, International ATLS working group. Advanced trauma life support (ATLS®): the ninth edition. J Trauma Acute Care Surg 2013; 74: 1363-1366
  • 4 Todsen T, Jensen ML, Tolsgaard MG. et al. Transfer from point-of-care Ultrasonography training to diagnostic performance on patients--a randomized controlled trial. Am J Surg 2016; 211: 40-45
  • 5 European Federation of Societies for Ultrasound in Medicine and Biology. Recommendations. Available at: http://www.efsumb.org/guidelines/guidelines01.asp [accessed August 19, 2016]
  • 6 Sclafani J, Tirrell TF, Franko OI. Mobile tablet use among academic physicians and trainees. J Med Syst 2013; 37: 9903
  • 7 Payne KFB, Wharrad H, Watts K. Smartphone and medical related App use among medical students and junior doctors in the United Kingdom (UK): a regional survey. BMC Med Inform Decis Mak 2012; 12: 121
  • 8 Masters K, Ellaway RH, Topps D. et al. Mobile technologies in medical education: AMEE Guide No. 105. Med Teach 2016; 38: 537-549
  • 9 Foss KT, Subhi Y, Aagaard R. et al. Developing an emergency ultrasound app – a collaborative project between clinicians from different universities. Scand J Trauma Resusc Emerg Med 2015; 23: 47
  • 10 Subhi Y, Foss KT, Henriksen M. et al. Development and use of web-based apps. Tidsskr Læring og Medier 2014; 7: 12
  • 11 Mosa ASM, Yoo I, Sheets L. A systematic review of healthcare applications for smartphones. BMC Med Inform Decis Mak 2012; 12: 67
  • 12 Ozdalga E, Ozdalga A, Ahuja N. The smartphone in medicine: a review of current and potential use among physicians and students. J Med Internet Res 2012; 14: e128
  • 13 Gaglani SM, Topol EJ. iMedEd: the role of mobile health technologies in medical education. Acad Med 2014; 89: 1207-1209
  • 14 Random.org, random number generator. Available at: http://www.random.org [accessed October 26, 2016]
  • 15 Kirkpatrick AW, Sirois M, Laupland KB. et al. Hand-Held Thoracic Sonography for Detecting Post-Traumatic Pneumothoraces: The Extended Focused Assessment With Sonography For Trauma (EFAST). J Trauma Inj Infect Crit Care 2004; 57: 288-295
  • 16 Tolsgaard MG, Todsen T, Sorensen JL. et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLoS One 2013; 8: e57687
  • 17 Tolsgaard MG, Ringsted C, Dreisler E. et al. Reliable and valid assessment of ultrasound operator competence in obstetrics and gynecology. Ultrasound Obstet Gynecol 2014; 43: 437-443
  • 18 Todsen T, Tolsgaard MG, Olsen BH. et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg 2015; 261: 309-315
  • 19 Subhi Y, Todsen T, Konge L. An integrable, web-based solution for easy assessment of video-recorded performances. Adv Med Educ Pract 2014; 5: 103-105
  • 20 Tolsgaard MG, Tabor A, Madsen ME. et al. Linking quality of care and training costs: cost-effectiveness in health professions education. Med Educ 2015; 49: 1263-1271
  • 21 Johnson AT. The technology hype cycle. IEEE Pulse 2015; 6: 50
  • 22 Zendejas B, Wang AT, Brydges R. et al. Cost: the missing outcome in simulation-based medical education research: a systematic review. Surgery 2013; 153: 160-176
  • 23 van Merriënboer JJG, Sweller J. Cognitive load theory in health professional education: design principles and strategies. Med Educ 2010; 44: 85-93
  • 24 Holtzman KZ, Swanson DB, Ouyang W. et al. Use of multimedia on the step 1 and step 2 clinical knowledge components of USMLE: a controlled trial of the impact on item characteristics. Acad Med 2009; 84: S90-S93
  • 25 Chang TP, Schrager SM, Rake AJ. et al. The effect of multimedia replacing text in resident clinical decision-making assessment. Adv Health Sci Educ Theory Pract 2016; Epub ahead of print.
  • 26 Maertens H, Madani A, Landry T. et al. Systematic review of e-learning for surgical training. Br J Surg 2016; 103: 1428-1437
  • 27 Van der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996; 1: 41-67
  • 28 Cantisani V, Dietrich CF, Badea R. et al. EFSUMB statement on medical student education in ultrasound (short version). Ultraschall in Med 2016; 37: 100-102
  • 29 Lund S, Boas IM, Bedesa T. et al. Association Between the Safe Delivery App and Quality of Care and Perinatal Survival in Ethiopia: A Randomized Clinical Trial. JAMA Pediatr 2016; 170: 765-771
