CC BY-NC-ND 4.0 · Appl Clin Inform 2022; 13(02): 400-409
DOI: 10.1055/s-0042-1744549
Research Article

Improving COVID-19 Research of University Hospitals in Germany: Formative Usability Evaluation of the CODEX Feasibility Portal

Brita Sedlmayr
1   Institute for Medical Informatics and Biometry, Carl Gustav Carus Faculty of Medicine, Technische Universität Dresden, Dresden, Germany
,
Martin Sedlmayr
1   Institute for Medical Informatics and Biometry, Carl Gustav Carus Faculty of Medicine, Technische Universität Dresden, Dresden, Germany
,
Björn Kroll
2   IT Center for Clinical Research, University of Lübeck, Lübeck, Germany
,
Hans-Ulrich Prokosch
3   Department of Medical Informatics, Biometrics and Epidemiology, Chair of Medical Informatics, Friedrich-Alexander Universität Erlangen-Nürnberg, Erlangen, Germany
,
Julian Gruendner
3   Department of Medical Informatics, Biometrics and Epidemiology, Chair of Medical Informatics, Friedrich-Alexander Universität Erlangen-Nürnberg, Erlangen, Germany
,
Christina Schüttler
3   Department of Medical Informatics, Biometrics and Epidemiology, Chair of Medical Informatics, Friedrich-Alexander Universität Erlangen-Nürnberg, Erlangen, Germany
Funding The study was conducted as part of NUM CODEX. NUM CODEX is funded by the German Federal Ministry of Education and Research (BMBF; FKZ 01KX2021).

Abstract

Background Within the German “Network University Medicine,” a portal is to be developed that enables researchers to query novel coronavirus disease 2019 (COVID-19) data from university hospitals to assess the feasibility of a clinical study.

Objectives The usability of a prototype for federated feasibility queries was evaluated to identify design strengths and weaknesses and derive improvement recommendations for further development.

Methods In a remote usability test using the thinking-aloud method and posttask interviews, 15 clinical researchers evaluated the usability of a prototype of the Feasibility Portal. The identified usability problems were rated according to severity, and improvement recommendations were derived.

Results The design of the prototype was rated as simple, intuitive, and usable with little effort. The usability test revealed a total of 26 problems, 8 of which were rated as “critical.” The usability problems and revision recommendations focus primarily on improving the visual distinguishability of selected inclusion and exclusion criteria, enabling a flexible approach to linking criteria, and enhancing the free-text search.

Conclusion Improvement proposals were developed for these usability problems; they will guide further development and the adaptation of the portal to user needs. This is an important prerequisite for correct and efficient use in everyday clinical work in the future. The results can provide developers of similar systems with a good starting point for interface conceptualization, and the methodological approach and the developed test guideline can serve as a template for similar evaluations.

Protection of Human and Animal Subjects

The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects, and was reviewed and approved by the Institutional Review Board at Friedrich-Alexander-Universität (FAU) Erlangen-Nürnberg (Germany; approval number: 85_21B). Participants were informed of the contents prior to study participation and voluntarily consented to participate.


Author Contributions

B.S. wrote the first version of the manuscript. B.S. and C.S. planned and conducted the usability study, which was supervised by H.-U.P. and M.S. J.G. led the team. B.S., C.S., J.G., and B.K. were significantly involved in the conceptual design and development of the evaluated prototype. C.S. and H.-U.P. were responsible for recruiting the participants. B.S. and C.S. analyzed all thinking-aloud and interview protocols as well as the recorded screen videos. All authors read the first version of the manuscript and provided valuable suggestions for changes.




Publication History

Received: 10 September 2021

Accepted: 03 January 2022

Article published online:
20 April 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
