CC BY-NC-ND 4.0 · Methods Inf Med 2023; 62(S 01): e10-e18
DOI: 10.1055/s-0042-1760249
Original Article

We Know What You Agreed To, Don't We?—Evaluating the Quality of Paper-Based Consent Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project

Henriette Rau 1 · Dana Stahl 1 · Anna-Juliana Reichel 1 · Martin Bialke 2 · Thomas Bahls 2 · Wolfgang Hoffmann 2

1   Trusted Third Party of the University Medicine Greifswald, Greifswald, Germany
2   Institute for Community Medicine, Section Epidemiology of Health Care and Community Health, University Medicine Greifswald, Greifswald, Germany
Funding The BFCC project was funded by the EU Interreg Baltic Sea Programme 2014-2020 (grant number #R001). gICS was developed as a part of the research grant program “Information infrastructure for research data” (grant number HO 1937/2-1) funded by the German Research Foundation (DFG).


Introduction Informed consent is the legal basis for research with human subjects. Therefore, the consent form (CF), as a legally binding document, must be valid, that is, completely filled in, stating the person's decision clearly, and signed by the respective person. However, paper-based CFs in particular may have quality issues, and their transformation into machine-readable information could introduce further errors. This paper evaluates the quality of paper-based CFs, and the quality issues that arise, using the example of the Baltic Fracture Competence Centre (BFCC) fracture registry. It also evaluates the impact of quality assurance (QA) measures, including site-specific feedback. Finally, it answers the question of whether manual data entry of patients' decisions by clinical staff leads to a significant error rate in digitalized paper-based CFs.

Methods Based on defined quality criteria, monthly QA including source data verification was conducted by two independent reviewers from the start of recruitment in December 2017. The analyses are based on the CFs collected from December 2017 until February 2019 (first recruitment period).

Results After internal QA revealed a sudden increase in quality issues in May 2018, site-specific feedback reports and follow-up training regarding CF quality were introduced in June 2018. Specific criteria and descriptions of how to correct the CFs helped increase quality in a timely manner. The most common issues were missing pages, missing decisions regarding optional modules, and missing signature(s). Since patients' datasets without valid CFs must be deleted, QA helped retain 65 datasets for research, so that the final data pool consisted of 840 (99.29%) patients.

Conclusion Every quality issue could be assigned to one of the predefined criteria. Using the example of the BFCC fracture registry, CF QA significantly increased CF quality and helped retain the number of datasets available for research. Consequently, the described quality indicators, criteria, and QA processes can be regarded as a best-practice approach.

Statement of Ethical Approval

All registry sites received a positive vote from their local ethics committees and, thus, ethical approval.

Authors' Contributions

H.R. drafted the manuscript. H.R. and D.S. were involved in the conception and implementation of the CF-QA. A.J.R. and H.R. conducted the CF-QA including feedback to partner hospitals and provided numbers and statistics regarding the BFCC CFs. M.B. and H.R. prepared the CF content in coordination with partners, data protection officers, and ethics committees, and M.B. implemented the digital CF templates. W.H. advised the clinical implementation of the consent process. T.B. was responsible for coordinating all work packages of the University Medicine Greifswald within the BFCC project and revised the manuscript critically. All authors read and approved the final manuscript.

Publication History

Received: 29 June 2022

Accepted: 04 October 2022

Article published online:
09 January 2023

© 2023. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND 4.0) License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
