Development of a Mobile Tool That Semiautomatically Screens Patients for Stroke Clinical Trials
Background and Purpose—Despite several national coordinated research networks, enrollment in many cerebrovascular trials remains challenging. An electronic tool was needed that would improve the efficiency and efficacy of screening for multiple simultaneous acute stroke clinical trials by automating the evaluation of inclusion and exclusion criteria, improving screening procedures, and streamlining communication between the stroke research coordinators and the stroke clinicians.
Methods—A multidisciplinary group consisting of physicians, study coordinators, and biostatisticians designed and developed an electronic clinical trial screening tool on a HIPAA (Health Insurance Portability and Accountability Act)-compliant platform.
Results—A web-based tool was developed that uses branch logic to determine eligibility for simultaneously enrolling clinical trials and automatically notifies the study coordinator teams about eligible patients. After 12 weeks of use, 225 surveys were completed, and 51 patients were enrolled in acute stroke clinical trials. Compared with the 12 weeks before implementation of the tool, enrollment increased from 16.5% to 23.4% of patients screened (P<0.05). Clinicians and coordinators reported increased satisfaction with the process and improved ease of screening.
Conclusions—We created a semiautomated electronic screening tool that uses branch logic to screen patients for stroke clinical trials. The tool has improved efficiency and efficacy of screening, and it could be adapted for use at other sites and in other medical fields.
Enrollment in clinical trials is a widespread challenge. Up to 60% of randomized clinical trials fail to reach target enrollment or require extension of the enrollment period.1,2 The consequences of slow enrollment include financial costs, delays in applying effective interventions, and increasing participant exposure time to an ineffective therapy.3,4 Clinical trial enrollment in acute stroke is particularly challenging, given the critical time-sensitivity of the interventions.5 The National Institutes of Health (NIH) National Institute of Neurological Disorders and Stroke (NINDS) Stroke Program Review Group reported that one of the main limitations in timely completion of stroke clinical trials is poor enrollment.5,6 The NIH has formed clinical research networks7,8 to help facilitate stroke clinical trial enrollment. Despite this, enrollment in many cerebrovascular trials remains challenging. At centers participating in acute stroke research, the volume of potential trial candidates and the number of actively recruiting trials may be high, and the methodology for identifying eligible patients is complex.
At Stanford University Medical Center, we have numerous active stroke clinical trials. Most trials require timely acute enrollment, and patients must be efficiently and expeditiously screened. To help address the challenge of timely screening in trials with complex eligibility criteria, increasing numbers of stroke trials have created their own mobile applications to identify eligible patients.9 Although these may aid in screening for a single trial, they do not address screening for multiple simultaneous trials. Further complicating the screening and enrollment process is the complexity of unique inclusion/exclusion criteria in each trial. Parallel screening by research coordinators and clinicians and limited prioritization of research screening by clinicians during busy clinical times were additional challenges. Together, these factors limited timely enrollment.
An innovative tool was needed to automate the evaluation of inclusion/exclusion criteria, improve screening procedures, and streamline communication between research coordinators and clinicians. The tool had to be simple, user-friendly, and accessible via a mobile platform; it also had to provide timely feedback to the research coordinators and securely transfer protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA).10 The objective was to improve the screening process for acute stroke trials.
We convened a multidisciplinary group, including stroke faculty and fellows, research coordinators, and biostatisticians. A smartphone application was not pursued because of the inability to update the application regularly and the lack of access on non-smartphone platforms. We used Stanford’s Research Electronic Data Capture (REDCap) tool, a platform designed to support clinical and translational research,11 because it is institutionally approved for protected health information, HIPAA-compliant, and free to our researchers as part of Stanford’s Clinical and Translational Science Award infrastructure.
Initial feasibility limitations included the many layers of authentication required to open our eligibility forms in REDCap. By converting our eligibility forms into REDCap surveys, accessible via a weblink from any internet browser, we bypassed the access problems and created a secure, compliant data collection mechanism. Clinicians can access the weblink via cellular or Wi-Fi networks. On completion of each survey, REDCap automatically generates a secure e-mail to the study coordinators, who then follow a hyperlink to review the screening results. Additionally, the primary investigator(s) for particular time-sensitive trials receive automated e-mail notifications if a patient screens eligible for their trial.
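The notification flow above can be sketched as follows. This is an illustrative reconstruction in Python, not REDCap's internal implementation: REDCap's own alert mechanism handles this without custom code, and the function name, recipients, and URL below are hypothetical.

```python
from email.message import EmailMessage


def build_eligibility_notification(record_id, eligible_trials, recipients,
                                   review_url):
    """Build the kind of automated message sent on survey completion.

    Details stay behind the secure review hyperlink rather than in the
    message body, mirroring the secure-transfer requirement described
    in the text. All names here are illustrative.
    """
    msg = EmailMessage()
    msg["Subject"] = f"Stroke screening survey completed (record {record_id})"
    msg["To"] = ", ".join(recipients)
    body = [
        f"A screening survey was completed for record {record_id}.",
        f"Review the results: {review_url}",
    ]
    if eligible_trials:
        # Time-sensitive trials also notify their primary investigators.
        body.append("Potentially eligible for: " + ", ".join(eligible_trials))
    msg.set_content("\n".join(body))
    return msg
```

A transport such as `smtplib.SMTP` over an institutionally approved secure relay would then deliver the message to the coordinator team.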
The screening survey uses conditional logic (known as branching logic in REDCap) to sort the applicable trial eligibility choices (Figure 1). The clinical team enters basic demographic information (name, medical record number [optional]) and stroke type (3 choices: ischemic [stroke or transient ischemic attack], hemorrhagic, not stroke). If ischemic stroke is chosen, these fields are collected: NIH stroke scale (free text field with integer limits of 0–42) and the time since last-known-well (3 choices: <12 hours, 12–24 hours, and >24 hours). If hemorrhagic stroke is chosen, these fields are collected: NIH stroke scale, the time since last-known-well (2 choices: <24 hours and >24 hours), and optionally, the Glasgow Coma Scale (a binary choice of either ≥5 or <5). Based on this information, the tool presents a list of trials for which the patient may be eligible, along with a brief description of each trial and inclusion criteria (Figure 2). The clinician then selects Eligible, Not Eligible, or Not Eligible Now-Continue Screening for each presented study. A free text box allows for additional notes.
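The branching logic can be illustrated with a small sketch. The trial names, NIHSS thresholds, and time windows below are hypothetical placeholders, not the actual criteria of the trials screened at our center; the sketch only mirrors the survey's field structure (stroke type, NIHSS 0-42, last-known-well window, optional Glasgow Coma Scale).

```python
# Hypothetical trial criteria; placeholders only, not real trials.
TRIALS = [
    {"name": "Ischemic Trial A", "type": "ischemic",
     "nihss_min": 6, "lkw": {"<12h"}},
    {"name": "Ischemic Trial B", "type": "ischemic",
     "nihss_min": 0, "lkw": {"<12h", "12-24h"}},
    {"name": "Hemorrhagic Trial C", "type": "hemorrhagic",
     "nihss_min": 0, "lkw": {"<24h"}, "gcs_min": 5},
]


def candidate_trials(stroke_type, nihss=None, lkw=None, gcs=None):
    """Mimic the survey's branching: filter trials by stroke type,
    NIHSS, and last-known-well window; GCS is checked only when given."""
    if stroke_type == "not stroke":
        return []
    if nihss is not None and not 0 <= nihss <= 42:
        raise ValueError("NIH Stroke Scale must be an integer from 0 to 42")
    matches = []
    for trial in TRIALS:
        if trial["type"] != stroke_type:
            continue
        if nihss is not None and nihss < trial["nihss_min"]:
            continue
        if lkw is not None and lkw not in trial["lkw"]:
            continue
        if gcs is not None and "gcs_min" in trial and gcs < trial["gcs_min"]:
            continue
        matches.append(trial["name"])
    return matches
```

The clinician would then confirm Eligible, Not Eligible, or Not Eligible Now-Continue Screening for each trial the logic surfaces.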
The process of screening and notification regarding stroke patients is described in detail in the online-only Data Supplement. After implementation of the screening tool, the fellows continued to hear about patients in the same ways, but completed a survey within 30 minutes of notification about a patient. Additional screening and enrollment were then completed by the research coordinators, though all investigators were able to consent and enroll patients if coordinators were not available.
After 12 weeks of use, a survey was administered to the study coordinators and stroke fellows to assess their impressions of the tool and the screening and enrollment process compared with the 12 weeks before implementation (Figure 3). Additionally, we compared the number of patients screened and enrolled in the 12 weeks before and after implementation. The 12-week comparison period was chosen to minimize differences in personnel and active trials. Survey responses were described as median (interquartile range), and enrollment numbers were compared by χ2 test.
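The enrollment comparison can be reproduced with a Pearson χ2 test on a 2×2 table. The counts below are hypothetical, chosen only to match the reported percentages (the pre-implementation screening denominator is not restated here), so the resulting statistic is illustrative rather than the published analysis.

```python
import math

# Hypothetical (enrolled, screened) counts matching the reported rates:
# 66/400 = 16.5% before implementation, 51/218 ~ 23.4% after.
before = (66, 400)
after = (51, 218)


def chi_square_2x2(a, b):
    """Pearson chi-square (no continuity correction) for two proportions.

    a, b: (successes, total) per group. Returns (chi2, p), where p comes
    from the 1-degree-of-freedom identity P(chi2_1 > x) = erfc(sqrt(x/2)).
    """
    obs = [[a[0], a[1] - a[0]], [b[0], b[1] - b[0]]]
    n = a[1] + b[1]
    col = [obs[0][0] + obs[1][0], obs[0][1] + obs[1][1]]
    chi2 = 0.0
    for i, row_total in enumerate((a[1], b[1])):
        for j in range(2):
            expected = row_total * col[j] / n
            chi2 += (obs[i][j] - expected) ** 2 / expected
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p


chi2, p = chi_square_2x2(before, after)
```

With these assumed counts, chi2 is about 4.37 and p about 0.037; the actual test in the report used the true screening logs.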
After 12 weeks of use, 225 surveys were completed, and 51 patients were enrolled in stroke clinical trials. Compared with the previous 12 weeks of screening, fewer patients were screened but a higher percentage of patients were enrolled (16.5% before versus 23.4% after implementation; P<0.05). Research coordinators reported a decrease in time devoted to screening and improved communication with the clinical team (Figure 3). Clinicians reported increased satisfaction with the process, improved ease of screening, fewer disruptions to clinical workflow to answer screening-related questions, and overall better communication among multiple research teams (Figure 3).
We created a semiautomated electronic screening tool that uses branch logic to screen patients for stroke clinical trials and automatically notifies the research team about eligible subjects. Since implementation, the tool has improved screening efficiency and efficacy. The clinical and research teams report improved satisfaction with the process.
Additional benefits of the REDCap platform include the ability to quickly update the instrument to reflect changes in trials, inclusion/exclusion criteria, and personnel. Details about the survey tool editing process are described in the online-only Data Supplement. The tool also provides an automatic record of all patients screened, which is helpful for clinical metrics and research screening logs. The REDCap platform is only available to subscribing institutions, which may limit broader accessibility. However, the outline and logic could be adapted and implemented in other web-based applications. Additional challenges include the continued need for some clinician screening, because the tool does not include every inclusion and exclusion criterion for each trial. Future directions include adding a similar branch logic algorithm for nonacute trials and developing a tool for research coordinators that includes all inclusion and exclusion criteria. We are also exploring how to integrate the current tool with the electronic medical record, so that survey data could be auto-populated for any stroke patient.
The field of stroke is rapidly evolving, driven by a large number of clinical trials. Many clinical centers face the challenges of enrolling patients in time-sensitive acute trials and screening for multiple simultaneous studies. A semiautomated web-based tool that uses branch logic has greatly improved clinical trial screening. The tool can likely be adapted to local sites and other medical fields and implemented on different platforms.
Sources of Funding
The Stanford Clinical and Translational Science Award is supported by National Institutes of Health UL1 RR025744.
The online-only Data Supplement is available with this article at http://stroke.ahajournals.org/lookup/suppl/doi:10.1161/STROKEAHA.116.013456/-/DC1.
- Received March 15, 2016.
- Revision received June 22, 2016.
- Accepted June 28, 2016.
- © 2016 American Heart Association, Inc.
- Puffer S, Torgerson D.
- Qureshi AI, Tariq N, Vazquez G, Novitzke J, Suri MF, Lakshminarayan K, et al.
- Grotta JC, Jacobs TP, Koroshetz WJ, Moskowitz MA.
- 7. NETT. Neurologic Emergencies Treatment Trials website. http://nett.umich.edu/. Accessed March 3, 2016.
- 8. National Institutes of Health StrokeNet website. http://www.nihstrokenet.org. Accessed March 3, 2016.
- Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377-381.