CC BY 4.0 · ACI open 2021; 05(02): e59-e66
DOI: 10.1055/s-0041-1732406
Research Article

Researcher Perceptions of a Self-Service Online Portal to Facilitate Volunteer Recruitment into Clinical Trials

Srinivas Emani
1   Division of General Internal Medicine, Department of Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts, United States
2   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Boston, Massachusetts, United States
,
Yichuan Grace Hsieh
3   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts, United States
4   Department of Medicine, Harvard Medical School, Boston, Massachusetts, United States
,
Greg Estey
2   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Boston, Massachusetts, United States
,
Holly M. Parker
2   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Boston, Massachusetts, United States
,
Xiaofeng Zhang
2   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Boston, Massachusetts, United States
,
Karen Donelan
5   Department of Medicine, Health Policy Research Center, Mongan Institute, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts, United States
,
Jeanhee A. Chung
3   Department of Medicine, Laboratory of Computer Science, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts, United States
4   Department of Medicine, Harvard Medical School, Boston, Massachusetts, United States
Funding This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
 

Abstract

Background Recruitment of volunteers is a major challenge for clinical trials. Internet-based portals are increasingly being developed and used to support recruitment into clinical research, yet there has been little research on how researchers use and perceive these portals.

Objectives This study evaluated researcher perceptions of the use of Rally, an Internet-based portal for clinical trial volunteer recruitment.

Methods A cross-sectional survey was developed and implemented to understand researcher perceptions. From theoretical models of information technology use, the survey adopted items in four domains: ease of use, usefulness, facilitating conditions, and self-efficacy. The dependent variable was researchers' behavioral intention to use Rally. The survey captured characteristics of researchers such as gender, age, and role. It was implemented using the REDCap survey tool. An email invitation followed by three reminders was sent to researchers. A hierarchical regression model was applied to assess predictors of behavioral intention.

Results The survey response rate was 35.6% (152 surveys received from 427 contacted researchers). In the hierarchical regression model, facilitating conditions and self-efficacy predicted behavioral intention (F(4, 94) = 6.478; p < 0.001). The model explained 21.6% of the variance in behavioral intention (R² change = 21.3%, p < 0.001).

Conclusion Facilitating conditions and self-efficacy predicted researchers' behavioral intention to use Rally for volunteer recruitment into clinical trials. Future research should document best practices and strategies for enhancing researcher use of online portals for volunteer recruitment.



Introduction

Accelerating contact with potential study recruits and improving methods of engagement remain persistent needs for clinical researchers.[1] [2] [3] [4] The use of web technologies to help potential study participants and their representatives find research opportunities and engage in research has lagged behind the growth of general Internet use by the American public and is discordant with the public's heavy use of the Internet to research health-related topics.[5] [6]

While there is considerable interest in the literature in tools and algorithms that match eligibility criteria against patient databases, the literature on the use of web portals and other web-based technologies in study recruitment is more limited. Today, the prevailing digital clinical research catalog and web-based resource is ClinicalTrials.gov (clinicaltrials.gov).[7] Maintained by the National Library of Medicine and the National Institutes of Health, clinicaltrials.gov serves as a registry of publicly and privately supported research studies across a broad range of diseases and conditions for the purposes of providing clinical trial information to the public and sharing clinical trial results; indeed, registration with clinicaltrials.gov is now mandatory for many trials (required by law and for journal publication).[8] Through the comprehensiveness of the underlying digital catalog and its well-developed application programming interface, clinicaltrials.gov has become a major source compendium for clinical research institutions increasingly interested in creating institution-specific research web portals to inform their local communities about research opportunities, often as a segue to recruitment.[9] [10]

Institution-specific listings created from study description information culled from clinicaltrials.gov require no additional effort from busy research teams. Why, then, build a study catalog that requires research teams to create additional descriptive content? Will those teams perceive any value in this effort? Sourcing trial descriptions solely from clinicaltrials.gov can present barriers to potential study volunteers who will use the descriptions to decide whether they are interested in volunteering for a study; an analysis comparing its readability with that of two other health care text corpora concluded that “ClinicalTrials.gov trial descriptions are the most difficult corpus, on average requiring 18 years of education in order to proficiently read and comprehend.”[11] The purpose of clinicaltrials.gov is to offer a comprehensive registry of clinical studies; it was not intended to inform or engage potential volunteers for the purposes of recruitment, a form of communication that arguably relies on including appropriate, actionable information, on readability, and on striking the right tone with the potential volunteer. During the ClinicalTrials.gov modernization public meeting on April 30, 2020 (describing a planned 4- to 5-year site modernization effort), the panel on web functionality highlighted that “the presentation on ClinicalTrials.gov can be made more user-friendly by the use of graphics and/or lay language.”[12]

Recognition of these limitations led to the creation of Rally (https://rally.partners.org/), a publicly available, searchable compendium of research opportunities across the hospitals in the healthcare system covered by Rally, based on study descriptions newly authored by research staff and written expressly for the potential volunteer. A wizard-like interface guides researchers through supplying relevant information, providing tips and reminders of their audience and purpose along the way. Using Rally's public interface, potential volunteers can search recruiting trials; find information about, and filter, studies on practical details that might help them decide whether a study is right for them (e.g., how much time and/or how many visits the study will require, whether travel is required, and whether it involves an overnight stay, blood draws, or another invasive procedure); respond to screening questions; and share their information directly with research teams.

Rally was first launched in October 2016 (Appendix A). As of December 31, 2018, more than 740 research projects had been published on the site, with an average of 250 projects actively recruiting through the site at any time, representing approximately 85% of the studies at the institution most likely to be recruiting from the public. The number of researchers using Rally steadily increased from the time of its launch, with a total of 1,760 researchers using the site overall and an average of approximately 300 unique researchers monthly; of those, approximately 21% were new users each month. During the first 27 months of Rally's operation, 36,628 volunteers expressed interest via the site, submitting their contact information to nearly 700 unique study teams.

The burden of recruitment on resource-constrained study teams is already high. This study assessed whether research teams would voluntarily use a web-based portal that requires them to go to the additional effort of authoring content for what some may feel is “another study listing” and obtaining separate approval for its use from an Institutional Review Board (IRB), and, if so, what intrinsic and extrinsic factors determined their willingness to use and reuse this portal.



Methods

Survey Development and Implementation

A survey-based approach was adopted for evaluating researcher perceptions of Rally. The survey was based on constructs from various models of information technology use. In the Technology Acceptance Model (TAM), one of the most widely used models, three factors were proposed to predict behavioral intention to use information technology: perceived ease of use, perceived usefulness, and attitudes.[13] [14] Another model, the Unified Theory of Acceptance and Use of Technology (UTAUT), combined constructs from other models such as TAM, the Theory of Planned Behavior (TPB), and Bandura's social cognitive theory in predicting behavioral intention to use information technology.[15] Beyond the predictors in TAM, UTAUT adds constructs such as self-efficacy, or confidence in using information technology; facilitating conditions; and voluntariness, that is, whether the use of the information technology is voluntary or mandatory.

While these different models of adoption and use of information technology were developed in industries outside of health care, they have been applied to adoption of information technology in health care, particularly the adoption of Internet-based patient portals.[16] [17] [18] [19] Of the different constructs proposed in the literature, the most pertinent were applied to this study. Evaluated predictors included researcher perceptions of the ease of use and usefulness of Rally, facilitating conditions, and self-efficacy. As Rally was an institutionally funded program, there was specific interest in evaluating whether researchers perceived resources such as online support (a facilitating condition) to be important in their use of Rally.

In developing the items for the four selected domains, the study relied on existing items and scales from the UTAUT model but modified some wording to fit the context of the use of Rally. For example, perceived ease of use was captured through two items: ease of use of Rally for creating and/or editing a project, and ease of use of Rally for tracking the status of prospects as they move through the screening process. Both items were measured on a Likert scale (0 = have not personally used, 1 = very easy, 2 = somewhat easy, 3 = not easy). The dependent variable of interest was researchers' behavioral intention to use Rally, measured through an affirmation to “continue to use Rally for another project in the future.” Appendix B lists the items and scales for the four domains and the dependent variable. In addition to the items on perceptions, researcher characteristics such as age, gender, and role were included.
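As an illustration of the scoring described in Appendix B, the following Python sketch (not part of the original study, which used SPSS) recodes the two ease-of-use items and averages them into the effort expectancy score; how missing values were handled in the averaging is an assumption, as the paper does not specify it.

```python
# Recode the two ease-of-use items (Appendix B, item 1) so that higher values
# reflect more positive perceptions, then average them into one score.
# Original coding: 0 = have not personally used, 1 = very easy,
# 2 = somewhat easy, 3 = not easy.
# Recoded:         0 -> missing, 1 -> 3 (very easy), 2 -> 2, 3 -> 1 (not easy).
EASE_RECODE = {0: None, 1: 3, 2: 2, 3: 1}

def effort_expectancy(item_1_1, item_1_2):
    """Average of the recoded ease-of-use items.

    If only one item is valid, that item is used on its own (an assumption;
    the paper does not state how missing values were handled).
    """
    recoded = [EASE_RECODE[v] for v in (item_1_1, item_1_2)]
    valid = [v for v in recoded if v is not None]
    return sum(valid) / len(valid) if valid else None

# Example: "very easy" for creating/editing, "somewhat easy" for tracking -> 2.5
print(effort_expectancy(1, 2))
```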

Survey data were collected and managed using REDCap (Research Electronic Data Capture), which is a secure, web-based software platform designed to support data capture for research studies, providing (1) an intuitive interface for validated data capture; (2) audit trails for tracking data manipulation and export procedures; (3) automated export procedures for seamless data downloads to common statistical packages; and (4) procedures for data integration and interoperability with external sources.[20] [21]
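For readers who manage similar REDCap projects, a hypothetical sketch of exporting survey records through the REDCap API is shown below. The paper does not state how the data were exported (manual CSV export from the REDCap web interface is equally possible), and the URL and token are placeholders.

```python
# Hypothetical sketch of exporting survey responses from a REDCap project via
# the REDCap API. The URL and token are placeholders, not values from the study.
import requests

REDCAP_API_URL = "https://redcap.example.org/api/"   # placeholder
API_TOKEN = "REPLACE_WITH_PROJECT_TOKEN"             # placeholder

payload = {
    "token": API_TOKEN,
    "content": "record",   # export records
    "format": "json",      # could also be "csv" or "xml"
    "type": "flat",        # one row per record
}

response = requests.post(REDCAP_API_URL, data=payload)
response.raise_for_status()
records = response.json()
print(f"Exported {len(records)} survey responses")
```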

The survey was pilot tested with a convenience sample of research coordinators to assess the understandability of the items. The pilot did not result in any changes to the survey or its items, and invitation emails with a survey link were sent to all researchers who had used Rally. Researchers included principal investigators (PIs) and co-investigators, as well as research team members such as project managers, clinical research coordinators (CRCs), and research assistants (RAs). Three reminder emails with survey links were sent at intervals of approximately 1 week. The study ran from January 10, 2019 through March 7, 2019. This research was exempted from review by the Mass General Brigham IRB, the ethics board overseeing research at our institution.



Statistical Analysis

Rally is adopted and used at the research project level, with research teams consisting of staff such as a PI, project managers, CRCs, and RAs. The survey unit of analysis was therefore defined at the level of the project team. Four hundred twenty-seven unique projects comprised the survey population. For some projects, multiple survey responses were received from within the same research team, for example, from both the PI and a CRC. To select a unique survey response within a project, early respondents were distinguished from late respondents. For example, CRC1 within a research team may have responded to the survey on January 11, 2019 (early respondent) and CRC2 within the same team may have responded on February 2, 2019 (late respondent). Responses from early respondents were selected for the analyses.
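A minimal pandas sketch of this selection rule, with hypothetical column names, might look as follows:

```python
# Sketch of the early-respondent selection rule described above, assuming a
# DataFrame with one row per survey response and hypothetical columns
# "project_id" and "response_date".
import pandas as pd

responses = pd.DataFrame({
    "project_id":    ["P01", "P01", "P02"],
    "respondent":    ["CRC1", "CRC2", "PI"],
    "response_date": ["2019-01-11", "2019-02-02", "2019-01-15"],
})
responses["response_date"] = pd.to_datetime(responses["response_date"])

# Keep the earliest response per project; later responses from the same team
# are set aside as "late respondents".
early = (responses.sort_values("response_date")
                  .drop_duplicates("project_id", keep="first"))
print(early)
```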

All items were recoded so that higher values reflected more positive perceptions (Appendix B). Proportions were computed for the variables capturing researcher characteristics, and means were computed for researcher perceptions of Rally. Chi-square tests were used to assess for significant differences in researcher characteristics between early and late respondents. Hierarchical regression modeling was used to assess predictors of the dependent variable, behavioral intention (continuing to use Rally in the future). As part of the regression modeling, the data were tested for multicollinearity.[22] A p-value of 0.05 was adopted to assess the significance of the results. IBM SPSS Statistical Software Version 24 was used for the analyses.
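The analyses were run in SPSS. As an illustration only, the following Python sketch (statsmodels, with hypothetical column and file names) shows an equivalent two-step hierarchical regression with an R-squared change test and variance inflation factors as a multicollinearity check:

```python
# Illustrative Python analogue of the SPSS analysis: a two-step hierarchical
# regression predicting behavioral intention, an F test for the change in
# R-squared, and variance inflation factors. Column and file names are
# hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

PREDICTORS_STEP1 = ["ease_of_use", "usefulness"]
PREDICTORS_STEP2 = PREDICTORS_STEP1 + ["self_efficacy", "facilitating_conditions"]
OUTCOME = "behavioral_intention"

df = (pd.read_csv("rally_survey.csv")                    # placeholder file name
        .dropna(subset=PREDICTORS_STEP2 + [OUTCOME]))    # same cases in both steps

def fit(predictors):
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[OUTCOME], X).fit()

step1 = fit(PREDICTORS_STEP1)
step2 = fit(PREDICTORS_STEP2)

# F test for the R-squared change between the nested models
f_change, p_change, _ = step2.compare_f_test(step1)
print(f"R2 step1 = {step1.rsquared:.3f}, R2 step2 = {step2.rsquared:.3f}")
print(f"R2 change = {step2.rsquared - step1.rsquared:.3f}, "
      f"F = {f_change:.3f}, p = {p_change:.4f}")

# VIFs for the step 2 predictors (values near 1 indicate little collinearity)
X2 = sm.add_constant(df[PREDICTORS_STEP2])
for i, name in enumerate(X2.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X2.values, i), 2))
```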



Results

There were 152 survey responses classified as early responses and 44 classified as late responses. The 152 early survey responses were selected for the study, yielding a response rate of 35.6% (152/427). [Table 1] shows the characteristics of all survey respondents (early vs. late); no statistically significant differences were found between the two groups. Of the 152 survey respondents selected for the study, the majority were female (n = 120, 78.9%), were 29 years or younger (n = 99, 65.1%), had a role as a CRC or RA (n = 96, 63.2%), and had been employed for 2 years or less (n = 97, 63.8%).

Table 1 Characteristics of survey respondents

| Characteristic | Early respondents (n = 152) | Late respondents (n = 44) | p-Value |
| --- | --- | --- | --- |
| Gender | | | 0.93 |
|  Female | 120 (78.9%) | 35 (79.5%) | |
|  Male or other gender identity | 32 (21.1%) | 9 (20.5%) | |
| Age | | | 0.92 |
|  ≤29 y | 99 (65.1%) | 29 (65.9%) | |
|  30 y or greater | 53 (34.9%) | 15 (34.1%) | |
| Role | | | 0.37 |
|  CRC or RA | 96 (63.2%) | 31 (70.5%) | |
|  Project manager, co-investigator, or PI | 56 (36.8%) | 13 (29.5%) | |
| Duration of employment | | | 0.59 |
|  2 y or less | 97 (63.8%) | 30 (68.2%) | |
|  Greater than 2 y | 55 (36.2%) | 14 (31.8%) | |

Abbreviations: CRC, clinical research coordinator; RA, research assistant; PI, principal investigator.


[Table 2] shows bivariate associations of researcher characteristics with perceptions of Rally. Researcher characteristics were not related to perceptions with the exception of the association of duration of employment with facilitating conditions. For example, there was no difference in mean perceived ease of use for creating and editing a project on the portal between CRCs and RAs (mean = 2.51) versus project managers, co-investigators, and PIs (mean = 2.61, p = 0.25). Similarly, there was no difference in self-efficacy between these two groups (mean = 4.5 for CRCs and RAs versus mean = 4.4 for project managers, co-investigators, and PIs, p = 0.32).

Table 2 Association of researcher characteristics with perceptions[a]

| Characteristic | Ease of use[b] | Usefulness | Facilitating conditions | Self-efficacy | Behavioral intention |
| --- | --- | --- | --- | --- | --- |
| Gender | | | | | |
|  Male or other gender identity | 2.45 | 0.36 | 4.41 | 4.65 | 4.52 |
|  Female | 2.56 (p = 0.35) | 0.58 (p = 0.23) | 4.23 (p = 0.43) | 4.42 (p = 0.22) | 4.57 (p = 0.71) |
| Age | | | | | |
|  ≤29 y | 2.52 | 0.51 | 4.21 | 4.49 | 4.52 |
|  30 y or greater | 2.60 (p = 0.41) | 0.58 (p = 0.64) | 4.40 (p = 0.28) | 4.35 (p = 0.35) | 4.62 (p = 0.37) |
| Role | | | | | |
|  CRC or RA | 2.51 | 0.48 | 4.24 | 4.5 | 4.52 |
|  Project manager, co-investigator, or PI | 2.61 (p = 0.25) | 0.63 (p = 0.30) | 4.31 (p = 0.70) | 4.4 (p = 0.32) | 4.63 (p = 0.33) |
| Duration of employment | | | | | |
|  2 y or less | 2.53 | 0.55 | 4.10 | 4.41 | 4.54 |
|  Greater than 2 y | 2.56 (p = 0.74) | 0.50 (p = 0.73) | 4.59 (p = 0.004) | 4.53 (p = 0.41) | 4.57 (p = 0.85) |

Abbreviations: CRC, clinical research coordinator; RA, research assistant; PI, principal investigator.

a Appendix B lists the perception items and scales adopted for the study.

b Item definitions are as follows: Ease of use: for each of the following items, please rate how easy it has been for you personally to use Rally; Usefulness: for each of the following items, please rate how useful Rally has been in helping you manage study recruitment; Facilitating conditions: thinking about your experience with Rally, please rate your agreement or disagreement with each of the following; Self-efficacy: I could complete most tasks on my own; Behavioral intention: I would use Rally to recruit for another project in the future.


[Table 3] shows results of the hierarchical regression model to assess predictors of behavioral intention to use Rally (captured as researcher evaluation that they will continue to use Rally in the future). The first step included two predictors: ease of use and usefulness. This model was not statistically significant. The second step introduced the other two predictors of interest: facilitating conditions and self-efficacy. This model was statistically significant (F(4, 94) = 6.478; p < 0.001) and explained 21.6% of the variance in behavioral intention to use Rally in the future. The introduction of facilitating conditions and self-efficacy explained an additional 21.3% of the variance in behavioral intention, after controlling for ease of use and usefulness (R² change = 0.213; F(2, 94) = 12.788; p < 0.001). In the final model, facilitating conditions was a stronger predictor of behavioral intention (β = 0.28, p < 0.05) than self-efficacy (β = 0.25, p < 0.05).

Table 3 Hierarchical regression model to predict behavioral intention (“continue to use Rally in the future”)

| Step / Predictor | R | R² | R² change | β | Sig |
| --- | --- | --- | --- | --- | --- |
| Step 1 | 0.053 | 0.003 | | | |
|  Ease of use | | | | 0.06 | 0.61 |
|  Usefulness | | | | −0.03 | 0.78 |
| Step 2 | 0.465 | 0.216 | 0.213[a] | | |
|  Ease of use | | | | 0.04 | 0.72 |
|  Usefulness | | | | −0.06 | 0.55 |
|  Self-efficacy | | | | 0.25 | 0.03[b] |
|  Facilitating conditions | | | | 0.28 | 0.01[b] |

a p < 0.001.

b p < 0.05.
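As a simple arithmetic check (not part of the original analysis), the reported F statistics can be approximately recovered from the R² values in Table 3 and the 94 residual degrees of freedom of the final model:

```python
# Sanity check: recover the reported F statistics from the R-squared values in
# Table 3, using 94 residual degrees of freedom as reported for the final model.
r2_full, r2_reduced, df_resid = 0.216, 0.003, 94

# Overall F for the 4-predictor model: F(4, 94)
f_overall = (r2_full / 4) / ((1 - r2_full) / df_resid)

# F for the R-squared change when adding 2 predictors: F(2, 94)
f_change = ((r2_full - r2_reduced) / 2) / ((1 - r2_full) / df_resid)

# Prints roughly 6.47 and 12.77, consistent with the reported 6.478 and 12.788
# once rounding of the R-squared values is taken into account.
print(round(f_overall, 2), round(f_change, 2))
```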




Discussion

This study evaluated researcher perceptions of Rally for patient recruitment and tracking for clinical trials using constructs from models on adoption of information technology. Researcher characteristics such as gender, age, and role were not found to be related to perceptions of Rally and behavioral intention to use the portal. For example, there were no differences between the researcher group comprising CRCs and RAs compared with the researcher group comprising project managers, co-investigators, and PIs on perceptions such as ease of use and usefulness or behavioral intention to use the portal. A perceived barrier to recruitment posed by “professional management hierarchies that separate research recruitment work from research leadership” has been documented by Adams and colleagues.[23] An integrated online portal like Rally makes recruitment progress and challenges visible to the full research team, including senior members.

In terms of behavioral intention to use the online portal, neither ease of use nor usefulness predicted behavioral intention to use Rally in the future. One explanation is that, in this specific study population, ease of use and usefulness do not play a role in behavioral intention. It is also possible that the rewording of the items for these two constructs, from a broad evaluation of use of the online portal to a more specific evaluation, contributed to the lack of fit of the model in our study. For example, the item on ease of use asked researchers to evaluate how easy it was for them personally to use Rally for the specific task of creating and/or editing a project on the site, whereas a broad evaluation of ease of use would have asked researchers at a global level how easy it was for them to use Rally. Additionally, given that researchers use multiple approaches beyond Rally to recruit patients for clinical trials, it is possible that they are unable to evaluate Rally as a distinct recruitment strategy.

The study did find two predictors of behavioral intention: facilitating conditions and self-efficacy. When facilitating conditions, such as the availability of online resources for using the portal, are present, researchers are more likely to continue to use the portal in the future. Rosa and colleagues identified several challenges to the use of digital technologies in clinical trials, including the identification of best practices and infrastructure issues related to the use of digital technologies.[6] This study found that providing researchers with facilitating conditions for using online recruitment portals, such as online help, infrastructure support, and other types of researcher support, would positively influence the continued use of such portals. Self-efficacy in using the portal, captured through researcher perceptions that they can do most tasks on their own, also predicted continued use of the portal in the future. If researchers are confident that they can accomplish tasks such as creating and editing a project, or tracking patients, through Rally, they will continue to use the portal in the future.

The study also found that less than a quarter of the variance in behavioral intention was explained by the predictors. This suggests that there are other factors that need to be considered in future research for predicting behavioral intention to use an online portal for volunteer recruitment. Attitudes, beliefs, and voluntariness of use of the portal may play important roles in predicting behavioral intention. Additional research is also needed to better understand why ease of use and usefulness did not emerge as predictors of behavioral intention to use the online portal; the use of global items rather than context-specific items may modify this relationship. Future research could document the set of best practices around enhancing facilitating conditions for researchers in the use of online portals for volunteer recruitment. Also, the use of an overall theoretical model that incorporates these factors such as the diffusion of innovation theory or the TPB could be explored as an extension of the research reported here.[24] [25] Contextual factors, such as specific clinical research domain, availability of local study populations, study participant burden, and alternative recruitment methods, are all likely to influence researcher intention to use a portal for volunteer recruitment. Systematic study of such factors will require studies of larger scope. This research can be considered an initial step toward a broader understanding of researcher use of web portals for clinical research requiring researcher-authored content such as the one used in this study as well as portals that repurpose registry content for recruitment of volunteers into clinical trials.[26] [27]



Limitations

This study was conducted in the setting of one healthcare system located in the northeastern United States, and its findings may not be generalizable to other systems and locations. The survey response rate was low, and nonresponders may have different perceptions of Rally than responders. The survey also did not capture the race of the researcher as part of the sociodemographic characteristics, although most researchers at our institution are Caucasian. Additionally, it did not capture researcher perceptions of the relative value of Rally compared with other recruitment methods (e.g., in-office recruitment) or the use of social media (e.g., Facebook). The study was also not large enough to capture differences that might be attributable to variations in study type. Other factors that may play important roles in predicting behavioral intention, such as attitudes, beliefs, and voluntariness of use of the portal, were not included in the study. Finally, this study focused on researcher perceptions, but research focusing on volunteer perceptions of online portals for clinical trial recruitment is equally important.[28]



Conclusion

In this study, constructs from theoretical models of use of information technology were applied to evaluate researcher perceptions of an online portal for volunteer recruitment for clinical trials and predict behavioral intention to use the portal. The study found that facilitating conditions and self-efficacy predicted behavioral intention to use the portal. Ease of use and usefulness of the portal were not identified as predictors. From a policy perspective, a “digital divide” in the use of Internet-based portals for clinical trial recruitment between research assistants and PIs was not found; thus, use of such portals may help address the barrier to recruitment of “professional management hierarchies that separate research recruitment work from research leadership.” Both the high cost of clinical trial recruitment and the evolving variety of approaches used in recruitment argue for further development of formal methods of assessment of the researcher perspective on the adoption and use of digital solutions for clinical trial recruitment.



Clinical Relevance Statement

Recruitment of subjects is a major challenge faced by many clinical trials. To address recruitment challenges, Internet-based portals are increasingly being applied to facilitate clinical trial recruitment. This study found that providing facilitating conditions for researchers to use Internet portals for clinical trial recruitment, such as online help, would positively influence continued use of such portals.


Appendix A Screen shot of Rally
Appendix B Items and Scales for User Perceptions of an Online Portal for Clinical Trial Recruitment
  1. Ease of use

    For each of the following items, please rate how easy it has been for you personally to use Rally (0 = have not personally used, 1 = very easy, 2 = somewhat easy, 3 = not easy)

    • 1.1. Creating and/or editing a project

    • 1.2. Tracking the status of prospects as they move through the screening process.

      • Effort expectancy = average of items 1.1 and 1.2 after recoding original items as: 0 = missing, 1 = not easy, 2 = somewhat easy, 3 = very easy.

  2. Usefulness

    For each of the following items, please rate how useful Rally has been in helping you manage study recruitment (1 = very useful, 2 = somewhat useful, 3 = not useful, 4 = no opinion).

    • 2.1. Tracking the status of prospects as they move through the screening process.

    • 2.2. Recording recruitment notes about contact with prospects.

      • Performance expectancy = Average of items 2.1 and 2.2 after recoding original items as: (−1 = not useful, 0 = no opinion, 1 = somewhat useful, 2 = very useful)

  3. Facilitating conditions

    Thinking about your experience with Rally, please rate your agreement or disagreement with each of the following (1 = strongly agree, 2 = agree, 3 = neutral, 4 = disagree, 5 = strongly disagree, 6 = not applicable):

    • 3.1. I could complete most tasks with just the online help resources and/or the Rally webinars for assistance.

    • 3.2. I could complete most tasks with Rally by requesting help if I got stuck.

      • Facilitating conditions = Average of items 3.1 and 3.2 after recoding original items as: (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree, 6 = missing)

  4. Self-efficacy

    Thinking about your experience with Rally, please rate your agreement or disagreement with each of the following (1 = strongly agree, 2 = agree, 3 = neutral, 4 = disagree, 5 = strongly disagree, 6 = not applicable):

    • 4.1. I could complete most tasks on my own.

      Item recoded as: (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree, 6 = missing)

  5. Behavioral intention

    • I would use Rally to recruit for another project in the future: (1 = strongly agree, 2 = agree, 3 = neutral, 4 = disagree, 5 = strongly disagree, 6 = not applicable).

    • Item recoded as: (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree, 6 = missing)



Conflict of Interest

None declared.

Protection of Human and Animal Subjects

This research was exempted from review by the Mass General Brigham IRB, the ethics board overseeing research at our institution.


Author Contributions

We confirm that the manuscript has been read and approved by all the named authors and that there are no other persons who satisfied the criteria for authorship but are not listed. We further confirm that the order of authors listed in the manuscript has been approved by all of us.


  • References

  • 1 McDonald AM, Treweek S, Shakur H. et al. Using a business model approach and marketing techniques for recruitment to clinical trials. Trials 2011; 12 (01) 74
  • 2 Huang GD, Bull J, Johnston McKee K, Mahon E, Harper B, Roberts JN. CTTI Recruitment Project Team. Clinical trials recruitment planning: a proposed framework from the Clinical Trials Transformation Initiative. Contemp Clin Trials 2018; 66 (66) 74-79
  • 3 Bower P, Brueton V, Gamble C. et al. Interventions to improve recruitment and retention in clinical trials: a survey and workshop to assess current practice and future priorities. Trials 2014; 15 (01) 399
  • 4 Davis JM, Sandgren AJ, Manley AR, Daleo MA, Smith SS. Optimizing clinical trial enrollment methods through “goal programming.”. Appl Clin Trials 2014; 23 (6-7): 46-50
  • 5 Pew Research Center. Technology adoption climbs among older adults. Accessed November 17, 2020 at: https://www.pewinternet.org/2017/05/17/tech-adoption-climbs-among-older-adults/
  • 6 Rosa C, Campbell ANC, Miele GM, Brunner M, Winstanley EL. Using e-technologies in clinical trials. Contemp Clin Trials 2015; 45 (Pt A): 41-54
  • 7 ClinicalTrials.gov. Accessed November 17, 2020 at: https://clinicaltrials.gov/
  • 8 Zarin DA, Tse T, Williams RJ, Carr S. Trial reporting in ClinicalTrials.gov-the final rule. N Engl J Med 2016; 375 (20) 1998-2004
  • 9 Trials Today. Accessed November 17, 2020 at: https://www.trialstoday.org/
  • 10 Clinical and Translational Science Institute. StudyFinder tool continues national expansion enabling public to explore studies at academic research institutions. University of Minnesota; Accessed November 17, 2020 at: https://www.ctsi.umn.edu/news-and-events/news/ctsi%E2%80%99s-studyfinder-tool-continues-national-expansion-enabling-public-explore-studies-academic-research-institutions
  • 11 Wu DT, Hanauer DA, Mei Q. et al. Assessing the readability of ClinicalTrials.gov. J Am Med Inform Assoc 2016; 23 (02) 269-275
  • 12 Williams RJ, Woloshin S, Gentile A, Rosenfeld SJ, Morgan SA. Website functionality panel. Accessed December 16, 2020 at: https://prsinfo.clinicaltrials.gov/modernization/WebsiteFunctionalityPanel.pdf
  • 13 Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. Manage Inf Syst Q 1989; 13 (03) 319-340
  • 14 Davis FD, Bagozzi RP, Warshaw PR. User acceptance of computer technology: a comparison of two theoretical models. Manage Sci 1989; 35 (08) 982-1003
  • 15 Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. Manage Inf Syst Q 2003; 27: 425-478
  • 16 Holden RJ, Karsh B-T. The technology acceptance model: its past and its future in health care. J Biomed Inform 2010; 43 (01) 159-172
  • 17 Tavares J, Oliveira T. Electronic health record patient portal adoption by health care consumers: an acceptance model and survey. J Med Internet Res 2016; 18 (03) e49
  • 18 Emani S, Healey M, Ting DY. et al. Awareness and use of the after-visit summary through a patient portal: Evaluation of patient characteristics and an application of the Theory of Planned Behavior. J Med Internet Res 2016; 18 (04) e77
  • 19 Emani S, Peters E, Desai S. et al. Perceptions of adopters versus non-adopters of a patient portal: an application of diffusion of innovation theory. JHI 2018; 25 (03) 149
  • 20 Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009; 42 (02) 377-381
  • 21 Harris PA, Taylor R, Minor BL. et al; REDCap Consortium. The REDCap consortium: Building an international community of software platform partners. J Biomed Inform 2019; 95: 103208
  • 22 Field A. Discovering Statistics Using SPSS. 2nd ed.. London: Sage Publications Ltd; 2005
  • 23 Adams M, Caffrey L, McKevitt C. Barriers and opportunities for enhancing patient recruitment and retention in clinical research: findings from an interview study in an NHS academic health science centre. Health Res Policy Syst 2015; 13 (01) 8
  • 24 Rogers EM. Diffusion of Innovations. 5th ed.. New York, NY: Free Press; 2003
  • 25 Fishbein M, Ajzen I. Predicting and Changing Behavior: The Reasoned Action Approach. New York: Psychology Press; 2010
  • 26 Harris PA, Scott KW, Lebo L, Hassan N, Lightner C, Pulley J. ResearchMatch: a national registry to recruit volunteers for clinical research. Acad Med 2012; 87 (01) 66-73
  • 27 Pulley JM, Jerome RN, Bernard GR. et al. Connecting the public with clinical trial options: the ResearchMatch Trials Today tool. J Clin Transl Sci 2018; 2 (04) 253-257
  • 28 Tabriz AA, Fleming PJ, Shin Y. et al. Challenges and opportunities using online portals to recruit diverse patients to behavioral trials. J Am Med Inform Assoc 2019; 26 (12) 1637-1644

Address for correspondence

Yichuan Grace Hsieh, PhD, RN
Laboratory of Computer Science, Massachusetts General Hospital
50 Staniford St, Suite 750, Boston, MA 02114
United States   

Publication History

Received: 14 July 2020

Accepted: 03 June 2021

Article published online:
15 September 2021

© 2021. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution License, permitting unrestricted use, distribution, and reproduction so long as the original work is properly cited. (https://creativecommons.org/licenses/by/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
