Appl Clin Inform 2022; 13(01): 139-147
DOI: 10.1055/s-0041-1742216
Research Article

Comparing the Use of DynaMed and UpToDate by Physician Trainees in Clinical Decision-Making: A Randomized Crossover Trial

Sally L. Baxter 1, 2, Lina Lander 3, Brian Clay 4, John Bell 4, Kristen Hansen 3, Amanda Walker 3, Ming Tai-Seale 3

1   Health Department of Biomedical Informatics, University of California San Diego, La Jolla, California, United States
2   Division of Ophthalmology Informatics and Data Science, Viterbi Family Department of Ophthalmology, Shiley Eye Institute, University of California San Diego, La Jolla, California, United States
3   Department of Family Medicine, University of California San Diego, La Jolla, California, United States
4   Department of Medicine, University of California San Diego, La Jolla, California, United States
Funding S.L.B. was supported by the National Institutes of Health/National Library of Medicine (training grant T15LM011271), the NIH Office of the Director (grant DP5OD029610), and an unrestricted departmental grant from Research to Prevent Blindness.

Abstract

Background Costs vary substantially among electronic medical knowledge resources used for clinical decision support, warranting periodic assessment of institution-wide adoption.

Objectives To compare two medical knowledge resources, UpToDate and DynaMed Plus, with respect to accuracy, time required to answer standardized clinical questions, and user experience.

Methods A crossover trial design was used, wherein physicians were randomized to first use one of the two medical knowledge resources to answer six standardized questions. Following use of each resource, they were surveyed regarding their user experience. The percentage of accurate answers and time required to answer each question were recorded. The surveys assessed ease of use, enjoyment using the resource, quality of information, and ability to assess level of evidence. Tests of carry-over effects were performed. Themes were identified within open-ended survey comments regarding overall user experience.

Results Among 26 participating physicians, accuracy of answers differed between resources by 4 percentage points or less. For all but one question, there were no significant differences in the time required for completion. Most participants felt both resources were easy to use, contained high-quality information, and enabled assessment of the level of evidence. A greater proportion of participants endorsed enjoyment of use with UpToDate (23/26, 88%) compared with DynaMed Plus (16/26, 62%). Themes from open-ended comments included interface/information presentation, coverage of clinical topics, search functions, and utility for clinical decision-making. The majority (59%) of open-ended comments expressed an overall preference for UpToDate, compared with 19% preferring DynaMed Plus.

Conclusion DynaMed Plus is noninferior to UpToDate with respect to answer accuracy, time required to answer clinical questions, ease of use, quality of information, and ability to assess level of evidence. However, user experience was more positive with UpToDate. Future studies of electronic medical knowledge resources should continue to emphasize evaluation of usability and user experience.

Protection of Human and Animal Subjects

The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects, and was reviewed and approved by the University of California San Diego Institutional Review Board.


Publication History

Received: 18 August 2021

Accepted: 04 December 2021

Article published online:
02 February 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 