Appl Clin Inform 2022; 13(01): 139-147
DOI: 10.1055/s-0041-1742216
Research Article

Comparing the Use of DynaMed and UpToDate by Physician Trainees in Clinical Decision-Making: A Randomized Crossover Trial

Sally L. Baxter
1   Department of Biomedical Informatics, University of California San Diego Health, La Jolla, California, United States
2   Division of Ophthalmology Informatics and Data Science, Viterbi Family Department of Ophthalmology, Shiley Eye Institute, University of California San Diego, La Jolla, California, United States
,
Lina Lander
3   Department of Family Medicine, University of California San Diego, La Jolla, California, United States
,
Brian Clay
4   Department of Medicine, University of California San Diego, La Jolla, California, United States
,
John Bell
4   Department of Medicine, University of California San Diego, La Jolla, California, United States
,
Kristen Hansen
3   Department of Family Medicine, University of California San Diego, La Jolla, California, United States
,
Amanda Walker
3   Department of Family Medicine, University of California San Diego, La Jolla, California, United States
,
Ming Tai-Seale
3   Department of Family Medicine, University of California San Diego, La Jolla, California, United States
Funding S.L.B. was supported by the National Institutes of Health/National Library of Medicine (training grant T15LM011271), the NIH Office of the Director (grant DP5OD029610), and an unrestricted departmental grant from Research to Prevent Blindness.
 

Abstract

Background Costs vary substantially among electronic medical knowledge resources used for clinical decision support, warranting periodic assessment of institution-wide adoption.

Objectives To compare two medical knowledge resources, UpToDate and DynaMed Plus, regarding the accuracy of answers to standardized clinical questions, the time required to answer them, and user experience.

Methods A crossover trial design was used, wherein physicians were randomized to first use one of the two medical knowledge resources to answer six standardized questions. Following use of each resource, they were surveyed regarding their user experience. The percentage of accurate answers and time required to answer each question were recorded. The surveys assessed ease of use, enjoyment using the resource, quality of information, and ability to assess level of evidence. Tests of carry-over effects were performed. Themes were identified within open-ended survey comments regarding overall user experience.

Results Among 26 participating physicians, the accuracy of answers differed by no more than 7 percentage points between resources on any question. For all but one question, there were no significant differences in the time required for completion. Most participants felt that both resources were easy to use, contained high-quality information, and enabled assessment of the level of evidence. A greater proportion of participants endorsed enjoyment of use with UpToDate (23/26, 88%) compared with DynaMed Plus (16/26, 62%). Themes from open-ended comments included interface/information presentation, coverage of clinical topics, search functions, and utility for clinical decision-making. The majority (59%) of open-ended comments expressed an overall preference for UpToDate, compared with 19% preferring DynaMed Plus.

Conclusion DynaMed Plus is noninferior to UpToDate with respect to ability to achieve accurate answers, time required for answering clinical questions, ease of use, quality of information, and ability to assess level of evidence. However, user experience was more positive with UpToDate. Future studies of electronic medical knowledge resources should continue to emphasize evaluation of usability and user experience.



Background and Significance

In a world of evolving medical practice and management guidelines that are constantly updated with new evidence, clinical care remains challenging for both practicing clinicians and trainees. Medical knowledge resources such as UpToDate and DynaMed Plus (which we will subsequently refer to as "DynaMed" for brevity) provide clinical decision support for answering clinical questions when clinicians lack personal experience or when scientific evidence is inconclusive. This is especially true for trainees, who have not yet accumulated a wide range of clinical experience. Several studies have demonstrated the common use of these resources among medical trainees.[1] [2] [3] [4] Even after training is complete, practicing physicians frequently use medical knowledge resources to inform evidence-based clinical decision-making.[5] [6] [7] [8] [9] These resources allow rapid information retrieval for specific clinical questions without requiring clinicians to expend the time and effort to search, read, and synthesize the primary published literature. A recent systematic review found that use of these knowledge resources was associated with a positive impact on clinician behaviors and patient effects (defined by the authors to include outcomes such as patient knowledge, length of stay, and patient symptoms), particularly increased success in answering clinical questions.[10] Another illustration of the acceptance and importance of these resources is that some are permitted for physicians' use during board certification procedures in some specialties.[11] The availability of medical knowledge resources is expanding, with an increasing number of companies developing software in this space.[12] As the market matures, so do the editorial quality, evidence-based methodology, and volume of diseases and medical conditions covered. The content, however, tends to be similar among the various vendors.[10] [13]

Medical knowledge resources often also include some degree of synthesis or editorial input from experts,[14] and therefore may be viewed by users as providing expert opinions. This gives clinicians some confidence that recommendations will be evidence-based or reflect expert knowledge, in contrast to more general (and nonvetted) sources of information such as Google or Wikipedia.[15] Prior studies have shown that clinicians searching for knowledge to inform clinical decision-making prefer synthesized information sources over original research.[9] Key features that make medical knowledge resources effective include ease of use with standardized formats and/or a summary for each topic, links to the original articles with concise synthesis of the information, availability of continuing medical education credits, freedom from advertisements, use of a strong evidence base (rather than expert opinion alone), ease of access, disclosure of any conflicts of interest, and allowance for both institutional and individual accounts. Cost considerations for these products are also becoming increasingly important in modern health care organizations operating with limited resources. Consequently, some organizations are changing providers, in part because of rising subscription costs.[16]



Objectives

At the University of California San Diego (UCSD), our health system faced a similar choice, as subscription costs for UpToDate continued to increase each year. Periodic assessment of all software products is essential to determine whether they continue to meet user needs or whether alternative products should be evaluated, and institutional entities such as libraries and information services need to make informed decisions by surveying clinical staff and assessing the quality of available resources.[17] In 2018, we introduced DynaMed[18] alongside UpToDate, giving users a potential alternative medical knowledge resource. Because many health care providers and learners had extensive experience with UpToDate, we anticipated that a potential implementation of DynaMed would require user-centered considerations and change management efforts. The objective of this study was to examine the comparative usability of DynaMed and UpToDate, with a secondary objective of assessing clinicians' and learners' willingness to adopt DynaMed in lieu of UpToDate. To inform our study, particularly the evaluation of perceived use and acceptance of the two medical knowledge resources, we drew on the Technology Acceptance Model (TAM), a framework commonly adapted to health information technology that encompasses constructs such as perceived usefulness, perceived ease of use, attitude toward using, behavioral intention, and actual use.[19] [20] This approach was rooted in prior literature demonstrating the importance of understanding not only users' ability to complete specific tasks, but also end-user preferences, when evaluating medical knowledge resources.[21]



Methods

This study was approved by the UCSD Institutional Review Board. Users of UpToDate, including medical students, residents, and faculty, were invited to participate via email. There were no exclusion criteria based on level of training or specialty area. The study was conducted at the UCSD Health Information Services computer laboratory on a weekend morning.

Two physicians trained in biomedical informatics developed six clinical cases based on clinical vignettes in the Medical Knowledge Self-Assessment Program (MKSAP), a widely used resource for medical education.[22] The cases encompassed areas of medicine, surgery, reproductive medicine, and pediatrics. Participants were instructed to use medical knowledge resources to search for information relevant to the cases, even if they knew the correct answer to the clinical case based on the description and question. The correct answers to the clinical cases were based on standardized answers to corresponding MKSAP clinical vignettes. Both physicians involved in clinical case development for this study agreed upon the correct answer choices for all case questions prior to study initiation.

The study design was a crossover randomized trial ([Fig. 1]). We used the Consolidated Standards of Reporting Trials (CONSORT) 2010 Statement with the crossover trial extension for guiding reporting.[23] Participants were randomized in a 1:1 allocation ratio using a random sequence generator to first use one of the two medical knowledge resources to answer test questions on six cases, with one question per clinical case. Randomization was used to mitigate potential bias arising from participants choosing to select a particular resource first for answering the clinical case questions. Blinding of randomization was not possible. After participants completed searching for answers using the medical knowledge resource to which they were first assigned, they switched to the other application and searched for answers on the same cases. A washout period between usage of the two resources was not possible due to scheduling constraints, i.e., the same group of trainees would not have been available if a washout period of several days were used due to complex clinical rotation schedules. The same cases and questions were used for both resources to minimize possible bias introduced by variations in case content and difficulty. We determined the percentage of accurate answers by dividing the number of participants who answered each question correctly by the total number of participants who submitted an answer for that question. Participants also recorded the time required to answer each question, using a stopwatch function on their smartphones. t-Tests were used to compare the time needed to complete each question using DynaMed versus UpToDate. Statistical significance was defined as p < 0.05. Statistical analyses were performed using R (version 3.5.2).
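To make this analysis concrete, below is a minimal sketch in R (the language used for the actual analyses, although this code and its variable names are illustrative assumptions, not the authors' code). The paper does not state whether the t-tests were paired or unpaired; a paired test is assumed here given the within-subject design.

```r
# Illustrative sketch only; data layout and names are hypothetical.

set.seed(1)

# Simple 1:1 randomization of the resource used first. Simple
# randomization can yield slightly unequal groups, as in this study
# (12 DynaMed Plus first vs. 14 UpToDate first).
first_resource <- sample(c("DynaMed Plus", "UpToDate"), size = 26,
                         replace = TRUE)

# Accuracy per question: number answering correctly divided by the
# number who submitted an answer (denominators varied; see Table 2).
accuracy <- function(n_correct, n_answered) n_correct / n_answered
accuracy(20, 26)  # e.g., Question 1 with DynaMed Plus: ~0.77

# Per-question comparison of answer times. 'times' is assumed to hold
# one row per participant x question x resource, with columns
# participant, question, resource, and minutes.
compare_time <- function(times, q) {
  d  <- subset(times, question == q)
  d  <- d[order(d$participant), ]                # align pairs
  dm <- d$minutes[d$resource == "DynaMed Plus"]
  ut <- d$minutes[d$resource == "UpToDate"]
  t.test(dm, ut, paired = TRUE)                  # assumed paired test
}
```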

Fig. 1 Crossover randomized trial study design. Participants (n = 26) were randomized to use either DynaMed Plus first or UpToDate first. They completed surveys after using each resource.

After completing the cases using each resource, participants were asked to complete structured questions regarding ease of use, enjoyment using the medical knowledge resource, the quality of information, and ability to assess the level of evidence. At the end of the trial, participants answered open-ended questions on an anonymous online survey instrument regarding their overall experience using both UpToDate and DynaMed. Demographic questions included level of training and years of prior experience with UpToDate. The full survey instrument is available in [Supplementary Appendix A] (available in the online version). The survey was initially tested with a small group of physicians to confirm face validity prior to deployment in the study. Survey data were collected and managed via the Qualtrics platform (Qualtrics, Provo, Utah, United States).

Both quantitative and qualitative analyses were performed. For demographic data and structured survey items, descriptive statistics were generated such as mean and standard deviation (SD) or counts/frequencies where appropriate. We identified common themes from the open-ended survey items. Free text/open-ended comments were also analyzed to determine the proportion of participants with comments stating preference for DynaMed, preference for UpToDate, or neutral/no preference. These were determined based on qualitative analysis rather than computational tools.

We also evaluated for carry-over (crossover) effects, that is, whether the first medical knowledge resource used (DynaMed first or UpToDate first) had any effect on outcomes. Outcomes assessed included the time needed to complete each clinical question, as well as responses to the structured survey items; for this analysis, Likert scale responses were treated as integers. Carry-over effects were evaluated using tests for equality of carry-over effects, with statistical significance defined as p < 0.05: a p-value below 0.05 would suggest that the carry-over effects differed between the randomization groups, whereas a p-value above 0.05 would suggest no significant carry-over effect.
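The exact procedure for the carry-over test is not specified in the text; a classic approach for a 2×2 crossover (often attributed to Grizzle) compares, between the two sequence groups, each participant's sum of responses across both periods, since unequal carry-over shifts these sums. A hedged sketch in R, under that assumption:

```r
# Assumed sum-of-periods test for equality of carry-over effects in a
# 2x2 crossover; the paper does not specify its exact procedure.
# period1/period2: each participant's outcome in the first and second
# period (e.g., minutes on a question, or a Likert response coded as
# an integer); sequence: "DynaMed Plus first" or "UpToDate first".
carryover_test <- function(period1, period2, sequence) {
  totals <- period1 + period2   # per-participant sum across periods
  # Equal carry-over implies equal mean totals in the two sequence
  # groups; p < 0.05 would suggest the carry-over effects differ.
  t.test(totals ~ sequence)
}
```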

There were no changes to eligibility criteria or details of the protocol (i.e., interventions or outcome measurements) after trial commencement. Sample size was based on convenience sampling of physicians in training willing to participate in the study and being available to participate (i.e., not on clinical service at that time). There were no losses to the trial given the short duration of the intervention. Analyses were conducted based on the original assigned groups; no participants switched groups after randomization.



Results

Study Population Characteristics

Twenty-six individuals participated in the crossover randomized trial (14/26 male, 54%; [Table 1]). Participants were nearly evenly split among medical students (9), residents (8), and fellows (9). Prior experience using UpToDate ranged from 1 to 10 years, with a mean (SD) of 5.8 (2.5) years.

Table 1

General characteristics of study participants

|                                    | DynaMed Plus first (n = 12) | UpToDate first (n = 14) | Overall (n = 26)  |
|------------------------------------|-----------------------------|-------------------------|-------------------|
| Gender                             |                             |                         |                   |
| Female                             | 7 (58.3%)                   | 4 (28.6%)               | 11 (42.3%)        |
| Male                               | 4 (33.3%)                   | 10 (71.4%)              | 14 (53.8%)        |
| Decline to answer                  | 1 (8.3%)                    | 0 (0%)                  | 1 (3.8%)          |
| Level of training                  |                             |                         |                   |
| Medical student                    | 4 (33.3%)                   | 5 (35.7%)               | 9 (34.6%)         |
| Resident                           | 4 (33.3%)                   | 4 (28.6%)               | 8 (30.8%)         |
| Fellow                             | 4 (33.3%)                   | 5 (35.7%)               | 9 (34.6%)         |
| Years of experience using UpToDate |                             |                         |                   |
| Mean (SD)                          | 5.75 (2.30)                 | 5.79 (2.78)             | 5.77 (2.52)       |
| Median [min, max]                  | 5.50 [2.00, 9.00]           | 5.00 [1.00, 10.0]       | 5.00 [1.00, 10.0] |



Accuracy of Answers to Clinical Cases

Overall, the accuracy of the participants' answers for the six case questions was similar, whether they used DynaMed or UpToDate ([Table 2]). For two cases, the percentages answering accurately were exactly the same. The difference in accuracy between those using DynaMed and UpToDate varied by 4 percentage points or less for three cases. In the one remaining case, there was a difference of 7 percentage points in accuracy, with a higher rate of accuracy achieved when participants used DynaMed ([Table 2]).

Table 2

Accuracy of answers to clinical questions using DynaMed Plus and UpToDate

|            | Accurate answers among DynaMed Plus users, N[a] (%) | Accurate answers among UpToDate users, N (%) | Difference (percentage points) |
|------------|-----------------------------------------------------|----------------------------------------------|--------------------------------|
| Question 1 | 20/26 (77%)                                         | 19/24 (79%)                                  | −2                             |
| Question 2 | 2/26 (8%)                                           | 3/26 (11%)                                   | −3                             |
| Question 3 | 19/25 (76%)                                         | 19/25 (76%)                                  | 0                              |
| Question 4 | 22/25 (88%)                                         | 22/25 (88%)                                  | 0                              |
| Question 5 | 11/25 (44%)                                         | 9/24 (37%)                                   | +7                             |
| Question 6 | 17/26 (65%)                                         | 18/26 (69%)                                  | −4                             |

a Denominators vary because of incomplete questionnaires (n = 26).




Time Required to Answer Clinical Case Questions

The mean time required to answer the clinical case questions was less than 5 minutes per question regardless of which medical knowledge resource was used ([Table 3]). Participants completed a total of six questions using each medical knowledge resource. For one question (Question 2), participants required 1.21 more minutes to answer when using DynaMed than when using UpToDate, which was statistically significant (p = 0.04). Of note, that same question was by far the least accurately answered among all questions (accuracy rate of 8% for DynaMed and 11% for UpToDate, see [Table 2]), possibly reflecting greater difficulty. For all other questions, there were no significant differences in the time required for completion ([Table 3]). Moreover, the medical knowledge resource that was used first did not exert any significant crossover effects on the time required to answer each question ([Table 3]).

Table 3

Time required to answer clinical case questions using DynaMed Plus and UpToDate

|            | Mean time to answer using DynaMed Plus (minutes) | Mean time to answer using UpToDate (minutes) | Difference in mean times (minutes) | t-statistic and p-value | Crossover effect (p-value) |
|------------|--------------------------------------------------|----------------------------------------------|------------------------------------|-------------------------|----------------------------|
| Question 1 | 4.08                                             | 3.63                                         | +0.45                              | t = 0.55 (p = 0.29)     | p = 0.72                   |
| Question 2 | 3.26                                             | 2.05                                         | +1.21                              | t = 1.79 (p = 0.04)     | p = 0.96                   |
| Question 3 | 3.57                                             | 3.58                                         | −0.01                              | t = −0.01 (p = 0.50)    | p = 0.53                   |
| Question 4 | 3.40                                             | 2.55                                         | +0.85                              | t = 1.55 (p = 0.06)     | p = 0.94                   |
| Question 5 | 4.19                                             | 4.68                                         | −0.49                              | t = −0.68 (p = 0.75)    | p = 0.50                   |
| Question 6 | 3.24                                             | 4.00                                         | −0.76                              | t = −1.00 (p = 0.75)    | p = 0.23                   |



Experience of Using the Products

The distribution of responses to the survey items is depicted in [Fig. 2]. The vast majority of participants agreed or strongly agreed with the statement that the resource was easy to use (20/26 [77%] for DynaMed, 22/26 [85%] for UpToDate). Participants also perceived both resources to contain high-quality information (23/26 [88%] for DynaMed and 26/26 [100%] for UpToDate) and felt able to assess the level of evidence (21/26 [81%] for DynaMed and 22/26 [85%] for UpToDate). Responses to the statement "I enjoyed using the resource to look for answers" were mixed. More participants (23/26, 88%) agreed or strongly agreed with this statement when using UpToDate than when using DynaMed (16/26, 62%). Almost one-third (8/26, 31%) disagreed with this statement when using DynaMed, whereas only one participant (4%) disagreed when using UpToDate. However, there was evidence that the crossover effects for this survey item differed between groups (p = 0.006): participants were much less likely to enjoy using DynaMed if they had been randomized to use UpToDate first. Similarly, there were significant crossover effects for the statement about quality of information (p = 0.005), with lower opinions of the quality of DynaMed's information among participants exposed to UpToDate first. The other items did not demonstrate any evidence of significant crossover effects.

Fig. 2 Comparison of UpToDate and DynaMed Plus with regard to ease of use, enjoyment of the software, quality of information, and ability to assess the level of evidence.

When asked whether they would be open to using DynaMed instead of UpToDate going forward, less than a quarter (6/26, 23%) were “definitely open,” but more than a third (9/26, 35%) were “somewhat open.” Taken together, a majority (15/26, 58%) were open to some extent to using DynaMed as a medical knowledge resource. However, a substantial proportion was still not open to switching to DynaMed from UpToDate, with 7 (27%) “somewhat not open” and 4 (15%) “definitely not open.”



Open-Ended Comments

All 26 (100%) participants responded with at least one open-ended comment, with a total of 37 comment entries between the two open-ended items. Themes that emerged from the comments included interface/information presentation, coverage of clinical topics, search functions, and utility for clinical decision-making.

Interface/Information Presentation

Many participants indicated that they preferred the bullet-point format used by DynaMed, which made it "easier to peruse" (fellow) and easier for "gathering the evidence in one spot" (fellow), compared with the lengthy narratives typical of UpToDate, which were "wordy and long" (medical student). However, several participants (three residents and one medical student) expressed distaste for the bullet-point format, saying that it detracted from readability and that they preferred prose. One participant (fellow) highlighted that the readability issues with bullet points could be more pronounced on mobile interfaces. Four participants (two medical students and two fellows) specifically recommended that DynaMed incorporate more tables to facilitate summarizing information.



Content Coverage

The content coverage in DynaMed “impressed” three participants (medical student, resident, fellow), one of whom cited the “depth of data in terms of citations/sources” (fellow). However, three participants felt UpToDate to be “more thorough” and that “it would be hard to use DynaMed as a standalone source” (two residents, one fellow). Two participants did not feel confident with the conclusions outlined in DynaMed. The first (a medical student) stated, “DynaMed made me want to look at primary sources.” The other (a resident) stated, “I would probably end up just looking up articles in PubMed. This would be difficult to do in clinic.”



Search Functions

Five participants (three medical students, two fellows) felt that the search algorithm in DynaMed was inferior. Cited reasons included nonspecific results, disorganized topics, and “jerky scrolling” (medical student).



Utility for Clinical Decision-Making

Finally, several participants felt that UpToDate was superior in terms of clinical utility. One stated, “UpToDate allows me to better utilize the data and apply the information to a specific patient” (medical student). Another felt that DynaMed lacked “clear summary of recommendations for physicians, including the author's recommendations” (resident). In a similar vein, another participant highlighted the editorial input from UpToDate as particularly helpful for areas of controversy where clear guidelines do not exist: “In cases of controversy (e.g., prophylactic transfusions in pregnant sickle cell patient), DynaMed authors don't break the tie. UpToDate authors often say 'our practice' in many of their articles. I think this can be helpful because I tend to use UpToDate particularly for management of areas without much evidence. So having a luminary in a given field share their practices can be helpful” (fellow).



Overall Preference

Out of the 37 total open-ended comments provided by respondents, 22 (59%) expressed an overall preference for UpToDate, compared with only 7 (19%) which expressed an overall preference for DynaMed. The remaining 8 (22%) were neutral. The preference for UpToDate appeared to wane with higher training level: the vast majority of comments from medical students and residents (8/11, 73% in each group) indicated a preference for UpToDate, while only a third (5/15, 33%) of comments from fellows indicated a preference for UpToDate.



Discussion

Electronic medical knowledge resources are widely used forms of clinical decision support. While there are a growing number of vendors developing these products,[7] there are relatively few studies that compare these products against each other in terms of usability and user experience.[5] Here, we evaluated two leading vendors, UpToDate and DynaMed, across several dimensions via a crossover randomized trial involving medical trainees at various stages of training.

First, the percentage of accurate answers to clinical case questions and the time required to complete the questions were generally similar regardless of the resource used. Therefore, by these objective criteria of facilitating medical trainees' ability to answer clinical questions with regard to time and accuracy, the performance of DynaMed was noninferior to UpToDate. These results provide further support for findings from prior studies that also demonstrate DynaMed's noninferiority, and in some cases superiority, to other products including UpToDate. For example, Alper et al found that primary care clinicians using DynaMed (compared with using only their usual information sources) answered more questions and changed clinical decisions more often within the same search time, thereby improving the efficiency of answering clinical questions.[14] Kwag et al's review of 26 products showed that DynaMed and UpToDate were two of the three highest-scoring products across multiple dimensions of evaluation, including content presentation, breadth of disease coverage, editorial quality, and evidence-based methodology.[12] A bibliometric analysis by Ketchum et al showed that DynaMed had the largest total number of references and the largest proportion of current references.[24] Jeffery et al[25] found that DynaMed had the highest proportion of topics from recently published articles in an evidence-rating service, while Banzi et al[26] found that DynaMed had the fastest updating speed compared with four other information summaries, including UpToDate. A recent study by Bradley-Ridout et al[27] found that while the time required to find answers was shorter with UpToDate, the accuracy of answers was similar between UpToDate and DynaMed among family medicine and obstetrics/gynecology residents. Some of the subjective survey responses in our study were also consistent with these findings, as the proportion of participants who "agreed" or "strongly agreed" that the resource was easy to use, contained high-quality information, and enabled assessment of the level of evidence was similar between DynaMed and UpToDate.

Over half of the participants in our study were "definitely open" or "somewhat open" to adopting DynaMed. However, participants were less likely to agree that they enjoyed using DynaMed, particularly in the group randomized to use UpToDate first. This was further illustrated by the open-ended comments, the majority of which expressed a preference for UpToDate. This finding was consistent with Bradley-Ridout et al's study[27] comparing UpToDate and DynaMed, in which resident physicians also expressed a strong preference for UpToDate. However, that study did not delve into the nuances of why participants had that preference. The comments in our study pointed to the higher readability of UpToDate's prose format over DynaMed's bullet-point format, better content coverage, and superior search functions. Importantly, participants also felt that UpToDate was more useful for clinical decision-making. Some attributed this to the presence of UpToDate authors' individual recommendations; essentially, they placed substantial value on expert opinion to "break the tie" when primary evidence was inconclusive.

There are several possible reasons why our study may have shown a strong preference for UpToDate despite similar performance in accuracy of answers and time to completion. Our institution had an extensive history of using UpToDate, with over 10 years of availability of the product prior to the time of the study. In addition, UpToDate is the most widely used provider of medical knowledge resources for clinical decision support, based on a nationwide survey of over 16,000 physicians by Marshall et al.[28] This was reflected in the experience level of our study participants, who on average had almost 6 years of prior experience using UpToDate. This prior knowledge of and familiarity with UpToDate may therefore have negatively affected the perceived usability of DynaMed, as others have posited as well.[27] While it would theoretically have been helpful to evaluate physicians without any prior experience with UpToDate, in practice this would have been nearly impossible given the widespread adoption of UpToDate in academic medical centers in the United States. Evaluating two products with vastly different market share is inherently challenging. Even outside the United States, UpToDate holds strong market share. For example, Addison et al conducted surveys of physicians in England's National Health Service and found "an overwhelming preference" for UpToDate compared with alternative resources such as BestPractice and DynaMed.[7]

Another possible explanation for the strong preferences expressed is that all participants were physicians in training. Trainees are likely to use medical knowledge resources more frequently and to depend on them to a greater extent than practicing physicians. Practicing physicians may therefore hold less strong preferences, because they rely relatively less on these resources and more on their own fund of knowledge and experience.[29] Another strategy for future investigation may be to evaluate medical students at the very beginning of their training, as they would not yet be accustomed to either medical knowledge resource and would not have established preferences. Evaluating these medical knowledge resources for trainee use will be increasingly important given the ever-growing emphasis on practicing and teaching evidence-based medicine.[30] [31] [32] Teaching trainees how to identify and apply trustworthy secondary sources of medical evidence, including the medical knowledge resources examined here, is critical given the time constraints associated with critically appraising a growing body of primary research literature.[33]

This analysis focused on desktop applications, but a future area of study would include analyzing mobile and tablet applications. A prior survey by Ellsworth et al found that only 10% of searches occurred on a mobile device or at home, but that analysis was conducted in 2015.[9] Over the last several years, smartphones and tablets have been increasingly utilized not just by the public, but also by health professionals, particularly as more electronic health record vendors develop mobile and tablet clients. A prior study conducted in 2016 found that the breadth of coverage and ease of use of several mobile medical knowledge resources were similar,[13] and further investigation may be warranted as more vendors develop mobile applications.

Limitations

Limitations of the study included the small sample size (deriving primarily from the scheduling constraints of coordinating a group of trainees for prospective evaluation), the lack of input from practicing physicians, and the restriction to desktop applications. Similarly, due to time and scheduling constraints, the assessment was performed with a relatively small set of clinical cases. Specific data regarding costs of the resources were not available for analysis and dissemination due to existing institutional nondisclosure agreements regarding contract negotiations with the vendors. Although the sample size limited the power to detect differences between the medical knowledge resources, general trends could still be examined. Moreover, the open-ended comments provided a venue for detailed insights and were organized around a few key themes. It is unclear whether a larger sample size would have yielded novel thematic content and insights, especially as prior studies of usability have demonstrated that 10 users would be sufficient to identify 97% of the potential usability issues in a software product.[34] Finally, in the context of the TAM framework, our study included elements related to perceived usefulness, perceived ease of use, and attitude (i.e., openness to adopting DynaMed), but did not formally assess behavioral intention to use or actual use.
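For context, the 97% figure follows from Nielsen and Landauer's model, in which the expected proportion of usability problems found by n independent users is 1 − (1 − λ)^n, with λ the average probability (roughly 0.31 in their data) that a single user uncovers a given problem. A quick check in R:

```r
# Nielsen & Landauer's discovery model: proportion of usability
# problems found by n users, with lambda ~0.31 per their data.
proportion_found <- function(n, lambda = 0.31) 1 - (1 - lambda)^n
proportion_found(10)  # ~0.976, i.e., about 97% of problems
```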



Conclusion

Our analysis demonstrated noninferiority of DynaMed to UpToDate with respect to the ability to achieve accurate answers, the time required to answer clinical questions, ease of use, quality of information, and ability to assess the level of evidence among medical trainees. However, user experience was more positive with UpToDate, and a majority of medical trainees still expressed a preference for UpToDate over DynaMed. Ultimately, our institution decided to continue licensing UpToDate despite rising costs, given end-user preferences and users' broad familiarity with and support of the product. Future studies of medical knowledge resources should continue to emphasize evaluation of usability and user experience, in addition to costs, as these have proven to be highly influential in adoption; they may be even more influential than factors such as content coverage, volume and recency of references, and level of evidence.



Clinical Relevance Statement

Electronic medical knowledge resources are commonly used for clinical decision support. In this study, we describe the results of a crossover randomized trial wherein physicians used and compared two common electronic medical knowledge resources to answer standardized clinical case questions. Understanding the relative strengths and weaknesses of various electronic medical knowledge resources for clinical decision support is important to inform institutional decisions regarding adoption of such resources, particularly because they are associated with substantial costs.



Multiple Choice Questions

  1. Which of the following is the most widely used electronic medical knowledge resource for clinical decision support in the United States?

    • DynaMed Plus.

    • UpToDate.

    • Epocrates.

    • Access Medicine.

    Correct Answer: The correct answer is option b, UpToDate. According to a nationwide survey of over 16,000 physicians described by Marshall et al, UpToDate is the most widely used provider of medical knowledge resources for clinical decision support.

  2. In this study, the authors concluded that differences in which of the following aspects primarily drove user preferences toward UpToDate?

    • Time required to find answers to questions.

    • The likelihood of being able to answer questions correctly.

    • The quality of information.

    • The user experience and availability of expert opinions.

    • The ability to assess the level of evidence.

    Correct Answer: The correct answer is option d, the user experience and availability of expert opinions. The remaining answer choices describe qualities that were felt to be roughly equivalent between the two resources in this study. The majority of open-ended comments indicated a preference for UpToDate due to factors such as higher level of readability, superior search functions, and availability of expert opinions to provide input on clinical decision-making when primary evidence was inconclusive.



Conflict of Interest

None declared.

Acknowledgments

L.L., B.C., J.B., and M.T.-S. conceived and designed the study. B.C. and J.B. designed the clinical case questions. S.L.B., L.L., K.H., and A.W. participated in data collection, analysis, and interpretation. S.L.B. and L.L. drafted the manuscript. All authors critically revised the manuscript. The authors thank all the study participants. The funder did not have any involvement in the review or approval of the manuscript for publication. The other authors do not have any financial disclosures.

Protection of Human and Animal Subjects

The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects, and was reviewed and approved by the University of California San Diego Institutional Review Board.


Supplementary Material

References

  • 1 Scaffidi MA, Khan R, Wang C. et al. Comparison of the impact of Wikipedia, UpToDate, and a digital textbook on short-term knowledge acquisition among medical students: randomized controlled trial of three web-based resources. JMIR Med Educ 2017; 3 (02) e20
  • 2 Lander B, Balka E. Exploring how evidence is used in care through an organizational ethnography of two teaching hospitals. J Med Internet Res 2019; 21 (03) e10769
  • 3 O'Carroll AM, Westby EP, Dooley J, Gordon KE. Information-seeking behaviors of medical students: a cross-sectional web-based survey. JMIR Med Educ 2015; 1 (01) e4
  • 4 Loda T, Erschens R, Junne F, Stengel A, Zipfel S, Herrmann-Werner A. Undergraduate medical students' search for health information online: explanatory cross-sectional study. JMIR Med Inform 2020; 8 (03) e16279
  • 5 Graber MA, Randles BD, Ely JW, Monnahan J. Answering clinical questions in the ED. Am J Emerg Med 2008; 26 (02) 144-147
  • 6 Ebell MH. How to find answers to clinical questions. Am Fam Physician 2009; 79 (04) 293-296
  • 7 Addison J, Whitcombe J, William Glover S. How doctors make use of online, point-of-care clinical decision support systems: a case study of UpToDate©. Health Info Libr J 2013; 30 (01) 13-22
  • 8 Bennett NL, Casebeer LL, Kristofco RE, Strasser SM. Physicians' Internet information-seeking behaviors. J Contin Educ Health Prof 2004; 24 (01) 31-38
  • 9 Ellsworth MA, Homan JM, Cimino JJ, Peters SG, Pickering BW, Herasevich V. Point-of-care knowledge-based resource needs of clinicians: a survey from a large academic medical center. Appl Clin Inform 2015; 6 (02) 305-317
  • 10 Maggio LA, Aakre CA, Del Fiol G, Shellum J, Cook DA. Impact of clinicians' use of electronic knowledge resources on clinical and learning outcomes: systematic review and meta-analysis. J Med Internet Res 2019; 21 (07) e13315
  • 11 ABIM. ABIM open-book assessments will feature access to UpToDate®. Accessed October 11, 2021 at: https://www.abim.org/media-center/press-releases/abim-open-book-assessments-will-feature-access-to-uptodate.aspx
  • 12 Kwag KH, González-Lorenzo M, Banzi R, Bonovas S, Moja L. Providing doctors with high-quality information: an updated evaluation of web-based point-of-care information summaries. J Med Internet Res 2016; 18 (01) e15
  • 13 Johnson E, Emani VK, Ren J. Breadth of coverage, ease of use, and quality of mobile point-of-care tool information summaries: an evaluation. JMIR Mhealth Uhealth 2016; 4 (04) e117
  • 14 Alper BS, White DS, Ge B. Physicians answer more clinical questions and change clinical decisions more often with synthesized evidence: a randomized trial in primary care. Ann Fam Med 2005; 3 (06) 507-513
  • 15 Andrews R, Mehta N, Maypole J, Martin SA. Staying afloat in a sea of information: point-of-care resources. Cleve Clin J Med 2017; 84 (03) 225-235
  • 16 Walden RR, Woodward NJ, Wallace RL. Reevaluating point-of-care resources: community engagement in difficult collection choices. Med Ref Serv Q 2019; 38 (01) 22-30
  • 17 Shurtz S, Foster MJ. Developing and using a rubric for evaluating evidence-based medicine point-of-care tools. J Med Libr Assoc 2011; 99 (03) 247-254
  • 18 Charbonneau DH, James LN. DynaMed Plus®: an evidence-based clinical reference resource. Med Ref Serv Q 2018; 37 (02) 168-176
  • 19 Ammenwerth E. Technology acceptance models in health informatics: TAM and UTAUT. Stud Health Technol Inform 2019; 263: 64-71
  • 20 Holden RJ, Karsh B-T. The technology acceptance model: its past and its future in health care. J Biomed Inform 2010; 43 (01) 159-172
  • 21 Campbell R, Ash J. An evaluation of five bedside information products using a user-centered, task-oriented approach. J Med Libr Assoc 2006; 94 (04) 435-441
  • 22 ACP. MKSAP 17 digital is no longer available. Accessed October 11, 2021 at: https://mksap17.acponline.org/
  • 23 Dwan K, Li T, Altman DG, Elbourne D. CONSORT 2010 statement: extension to randomised crossover trials. BMJ 2019; 366: l4378
  • 24 Ketchum AM, Saleh AA, Jeong K. Type of evidence behind point-of-care clinical information products: a bibliometric analysis. J Med Internet Res 2011; 13 (01) e21
  • 25 Jeffery R, Navarro T, Lokker C, Haynes RB, Wilczynski NL, Farjou G. How current are leading evidence-based medical textbooks? An analytic survey of four online textbooks. J Med Internet Res 2012; 14 (06) e175
  • 26 Banzi R, Cinquini M, Liberati A. et al. Speed of updating online evidence based point of care summaries: prospective cohort analysis. BMJ 2011; 343: d5856
  • 27 Bradley-Ridout G, Nekolaichuk E, Jamieson T. et al. UpToDate versus DynaMed: a cross-sectional study comparing the speed and accuracy of two point-of-care information tools. J Med Libr Assoc 2021; 109 (03) 382-387
  • 28 Marshall JG, Sollenberger J, Easterby-Gannett S. et al. The value of library and information services in patient care: results of a multisite study. J Med Libr Assoc 2013; 101 (01) 38-46
  • 29 Goodyear-Smith F, Kerse N, Warren J, Arroll B. Evaluation of e-textbooks. DynaMed, MD Consult and UpToDate. Aust Fam Physician 2008; 37 (10) 878-882
  • 30 Djulbegovic B, Guyatt GH. Progress in evidence-based medicine: a quarter century on. Lancet 2017; 390 (10092): 415-423
  • 31 Maggio LA, Tannery NH, Chen HC, ten Cate O, O'Brien B. Evidence-based medicine training in undergraduate medical education: a review and critique of the literature published 2006-2011. Acad Med 2013; 88 (07) 1022-1028
  • 32 Kumaravel B, Hearn JH, Jahangiri L, Pollard R, Stocker CJ, Nunan D. A systematic review and taxonomy of tools for evaluating evidence-based medicine teaching in medical education. Syst Rev 2020; 9 (01) 91
  • 33 Tikkinen KAO, Guyatt GH. Understanding of research results, evidence summaries and their applicability-not critical appraisal-are core skills of medical curriculum. BMJ Evid Based Med 2021; 26 (05) 231-233
  • 34 Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '93. New York City, NY: ACM Press; 1993: 206-213

Address for correspondence

Sally L. Baxter, MD, MSc
University of California San Diego
9415 Campus Point Drive MC0946, La Jolla, CA 92093
United States   

Publication History

Received: 18 August 2021

Accepted: 04 December 2021

Article published online:
02 February 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
