Benchmarks and Determinants of Adherence to Stroke Performance Measures
Background and Purpose— To develop achievable benchmarks for 9 stroke performance measures (PM) and to identify organizational factors associated with adherence.
Methods— Adherence rates and achievable benchmarks were determined for 9 PM within a study of patients (n=2294) admitted with acute ischemic stroke at 17 hospitals. Baseline information on hospital characteristics and stroke-specific processes of care was collected, and multi-level models were used to test the association of these factors with adherence.
Results— Benchmarks were ≥90% for 8 of the 9 PM. After controlling for clustering, only the use of standing orders was associated with adherence to PM, including dysphagia screening, deep vein thrombosis prophylaxis, consideration of tPA, and provision of educational material.
Conclusion— High levels of adherence are achievable for several acute stroke PM. Use of standing orders is associated with adherence to PM requiring immediate action on admission.
- delivery of health care
- practice improvement
- quality of health care
- process assessment (health care)
- cerebral infarction
Several projects assess in-hospital ischemic stroke quality of care, and most show wide state-to-state, region-to-region, and country-to-country variation in adherence to guideline recommendations.1,2 In cardiac care, the provision of performance measure (PM) benchmarks has been shown to improve adherence more than audit and feedback alone.3 Currently, few nationally published benchmarks for the care of patients with ischemic stroke show what is achievable in routine care.
The Stroke Practice Improvement Network (SPIN) was a multi-centered, prospective longitudinal study designed to develop and test strategies to improve the quality of in-hospital stroke care. Here, we report on the baseline data used to derive achievable benchmarks for 9 in-hospital PM and to determine which organizational and process of care factors are associated with high or low adherence to these measures.
Seventeen volunteer hospitals, including 13 community hospitals, in 9 states were chosen to participate. Each site was required to fund its own study coordinator.
The highest rated PM developed by a multidisciplinary expert panel were provided to the study sites.4 Sites rated these for relevance, room for improvement, the potential for developing a quality improvement (QI) activity for the measure, and overall desire to include the measure in this project. Based on site ratings, 4 PM were required: door-to-needle time of ≤1 hour for tPA (tPA1), screening for dysphagia (dysphagia), prophylaxis for deep vein thrombosis (DVT), and warfarin for atrial fibrillation (afib). An additional 5 PM were optional: discharge on antithrombotics (thrombotics), tPA considered (tPA), etiology documented (etiology), smoking assessed and counseling given (smoke), and stroke education and resources given (ed).
For data validation, entered data were compared with independent chart review for a randomly selected sample comprising 10% of total patient enrollment. Overall reliability of the data was good (kappa=0.68).
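Agreement between abstracted data and independent chart review can be summarized with Cohen's kappa, which compares observed rater agreement with the agreement expected by chance. A minimal sketch of the statistic (not the study's actual validation code; the example data are invented):

```python
def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters classifying the same items."""
    n = len(rater_a)
    # observed proportion of agreement
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's marginal frequencies
    categories = set(rater_a) | set(rater_b)
    pe = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

# hypothetical data entry vs. independent chart review (1 = adherent)
entered  = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
reviewed = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
kappa = cohen_kappa(entered, reviewed)  # ≈ 0.58
```

By the conventional rule of thumb, kappa values in the 0.61 to 0.80 range (as the study's 0.68) indicate substantial agreement.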
Achievable benchmarks of care were developed.5 The benchmarks represent the average performance of the top 10% of sites, adjusted for differences in the number of patients at each site. The baseline data collection period ran from December 3, 2001 through December 4, 2002.
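The achievable benchmark of care (ABC) method of Kiefe et al5 can be sketched as follows: rank sites by a Bayesian-adjusted performance fraction (which keeps very small sites from dominating the ranking), take the best-performing sites until they cover at least 10% of all eligible patients, and pool their adherence. A hedged illustration with invented counts, not the study's code:

```python
def achievable_benchmark(sites, top_fraction=0.10):
    """sites: list of (adherent, eligible) counts per hospital.
    Returns the pooled adherence of the best-performing sites
    covering at least `top_fraction` of all eligible patients."""
    total = sum(d for _, d in sites)
    # adjusted performance fraction (x + 1) / (d + 2) tempers small denominators
    ranked = sorted(sites, key=lambda s: (s[0] + 1) / (s[1] + 2), reverse=True)
    chosen, covered = [], 0
    for x, d in ranked:
        chosen.append((x, d))
        covered += d
        if covered >= top_fraction * total:
            break
    return sum(x for x, _ in chosen) / sum(d for _, d in chosen)

# invented adherence counts for 5 hypothetical hospitals
sites = [(95, 100), (40, 50), (180, 200), (20, 40), (55, 110)]
benchmark = achievable_benchmark(sites)  # 0.95
```

Pooling over the top sites rather than taking the single best rate makes the benchmark demonstrably attainable in routine care while smoothing out small-sample flukes.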
Separate analyses were performed for each PM. Logistic regression was used to calculate odds ratios (OR) and corresponding 95% confidence intervals (CI). Clustering within hospitals was controlled for using a generalized linear mixed model. The variables analyzed were hospital characteristics that we expected a priori to be related to adherence rates: size, teaching status, QI infrastructure, presence of a stroke unit, pathways, team, and standing orders.6 Standing orders were assessed at the patient level.
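For a single binary predictor, the unadjusted OR and Wald 95% CI underlying such univariate analyses can be computed from a 2×2 table. A minimal sketch with invented counts (the study's models additionally adjusted for clustering within hospitals, which this does not do):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a/b = adherent/non-adherent with the factor present,
    c/d = adherent/non-adherent with the factor absent.
    Returns (OR, lower, upper) with a Wald 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# hypothetical counts: adherence with vs. without standing orders
or_, lower, upper = odds_ratio_ci(90, 10, 70, 30)  # OR ≈ 3.86
```

A CI excluding 1.0, as here, indicates an association significant at the 0.05 level; the mixed model then tests whether this survives adjustment for between-hospital clustering.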
Data on 2294 cases were collected by the 17 sites. The average age of enrollees was 71 years; 50% were female, 82% white, and 9.5% black. The overall inpatient mortality rate was 4.9%. Thirteen sites collected all 9 PM, and 4 sites collected 6 of the 9 PM.
Of note, there was substantial collinearity among the hospital-level predictor variables, which precluded their separate analysis in the multivariable models. All hospitals with stroke units had stroke pathways, and 80% had stroke teams. Teaching hospitals, those with stroke units, and those with stroke pathways were largely the same hospitals.
Table 1 shows the adherence rates for each of the 9 PM. Univariate analysis showed that many of the hospital-level variables (pathways, hospital type, QI infrastructure) were associated with adherence to at least 1 PM. On multivariate analysis, pathways and the presence of a stroke team were associated with increased dysphagia, tPA, and tPA1 adherence. QI infrastructure was associated with increased provision of education (ed).
After controlling for the effect of clustering of patients, only use of standing orders remained significantly associated with adherence to any of the PM (Table 2).
Achievable benchmarks are reported for 9 aspects of performance of hospital stroke care. These PM were not meant to quantify the quality of care of the hospital but to include different aspects of acute stroke care: hyper-acute care, in-hospital care, and discharge care.
Except for 1 measure related to the delivery of thrombolytics, it is possible in the clinical setting to adhere to these processes of care 90% of the time or better. Although our overall average adherence to similar PM is slightly higher than that seen in other countries, the upper range of adherence rates appears similar to ours, suggesting that the benchmarks may be similar even if the averages are lower.1,2
We found no single component or tool that was associated with high adherence to all measures. The lack of association with hospital size or academic status suggests that these PM can be implemented at any hospital and are patient related and disease specific. The importance of standing orders was demonstrated for many care processes that require action immediately on admission (for the prevention of complications); this has also been found in the cardiac and stroke literature.7,8 It has been proposed that the use of standing orders in European stroke units may explain their lower mortality and dependency rates. We did not have sufficient power to separate the effect of stroke units from that of standing orders.
Our study has several limitations. The lack of association between hospital-level characteristics and adherence rates may be attributable to low power to detect differences among only 17 hospitals. Additionally, sites were self-selected; however, because most of our hospitals were community hospitals, the results should reflect what can be attained by other community hospitals willing to invest resources in QI. Our sites also chose which PM to work on, and whether they would have achieved similar results on measures imposed on them is unknown. At the time of this study, Medicare was testing 2 of the required measures, which probably accounts for their high rating. This may suggest that other measures, if mandated by a national or regional organization, would also lead to substantial improvement, but more research is needed into what factors motivate practitioners or organizations to change. Finally, gaps in the existing research will need to be addressed in future studies to facilitate improved adherence to standards of care for the acute stroke patient.
See supplemental Table I, available online at http://stroke.ahajournals.org.
Sources of Funding
Dr Hinchey was supported by grant No. K23 NS02163 from the National Institute of Neurological Disorders and Stroke (NINDS). Dr Kent was supported by grant No. K23 NS044929 from NINDS. This project was funded by the American Academy of Neurology, the American Heart Association, and an unrestricted educational grant from Boehringer Ingelheim Pharmaceuticals, Inc.
- Received September 7, 2007.
- Accepted September 24, 2007.
Cadilhac DA, Ibrahim J, Pearce DC, Ogden KJ, McNeill J, Davis SM, Donnan GA; for the SCOPES Study group. Stroke. 2004; 35: 1035–1040.
Heuschmann PU, Biegler MK, Busse O, Elsner S, Grau A, Hasenbein U, Hermanek P, Janzen RWC, Kolominisky-Rabas PL, Kraywinkel K, Lowitzsch K, Misselwitz B, Nabavi DG, Otten K, Pientka L, von Reutern GM, Ringelstein EB, Sander D, Wagner M, Berger K. Development and implementation of evidence-based indicators for measuring quality of acute stroke care. The quality indicator board of the German stroke registers study group (ADSR). Stroke. 2006; 37: 2573–2578.
Holloway RG, Vickrey BG, Benesch C, Hinchey JA, Bieber J. Development of performance measures for acute ischemic stroke. Stroke. 2001; 32: 2058–2074.
Kiefe CI, Weissman NW, Allison JJ, Farmer R, Weaver M, Williams OD. Identifying achievable benchmarks of care (ABCs): concepts and methodology. Int J Qual Health Care. 1998; 10: 443–447.
Wagner C, De Baaker DH, Groenewegen PP. A measuring instrument for evaluation of quality systems. Int J Qual Health Care. 1999; 11: 119–130.
California Acute Stroke Pilot Registry (CASPR) Investigators. The impact of standardized stroke orders on adherence to best practices. Neurology. 2005; 65: 360–365.