Variation in Do-Not-Resuscitate Orders for Patients With Ischemic Stroke
Implications for National Hospital Comparisons
Background and Purpose—Decisions on life-sustaining treatments and the use of do-not-resuscitate (DNR) orders can affect early mortality after stroke. We investigated the variation in early DNR use after stroke among hospitals in California and the effect of this variation on mortality-based hospital classifications.
Methods—Using the California State Inpatient Database from 2005 to 2011, ischemic stroke admissions for patients aged ≥50 years were identified. Cases were categorized by the presence or the absence of DNR orders within the first 24 hours of admission. Multilevel logistic regression models with a random hospital intercept were used to predict inpatient mortality after adjusting for comorbidities, vascular risk factors, and demographics. Hospital mortality rank order was assigned based on this model and compared with the results of a second model that included DNR status.
Results—From 355 hospitals, 252 368 cases were identified, including 33 672 (13.3%) with early DNR. Hospital-level–adjusted use of DNR varied widely (quintile 1, 2.2% versus quintile 5, 23.2%). Hospitals with higher early DNR use had higher inpatient mortality, which more than doubled from quintile 1 (4.2%) to quintile 5 (8.7%). Failure to adjust for DNR orders resulted in substantial hospital reclassification across the rank spectrum, including among high-mortality hospitals.
Conclusions—There is wide variation in the hospital-level proportion of ischemic stroke patients with early DNR orders; this variation affects hospital mortality estimates. Unless the circumstances of early DNR orders are better understood, mortality-based hospital comparisons may not reliably identify hospitals providing a lower quality of care.
Early mortality after ischemic stroke is currently used as a publicly reported measure of hospital-based quality of care by some reporting systems,1 and a 30-day risk-adjusted mortality model for stroke is under development by the Centers for Medicare & Medicaid Services (CMS). The justification behind the CMS model and others is that higher mortality is indicative of lower quality of care. This rationale may fail to adequately account for variation in patient preferences because most early deaths after ischemic stroke are related to patient or family preferences to withhold or withdraw potential life-sustaining interventions, such as artificial hydration and nutrition or mechanical ventilation.2,3
Do-not-resuscitate (DNR) orders placed early in an ischemic stroke hospitalization are associated with a higher risk of mortality after stroke.4,5 Early DNR orders could reflect excessive physician pessimism about possible outcomes after severe stroke, and in this setting the quality of care may in fact be suboptimal. In contrast, early DNR orders (and other limitations on intensity of treatment) could also be indicative of pre-existing patient or family preferences for limitations of life-sustaining treatment or for prioritizing comfort over length of life when facing possible disability. In this scenario, the use of DNR orders and other limitations in life-sustaining treatment would reflect appropriate matching of treatment plan to patient goals and would be considered high-quality, patient-centered care. Because the exact circumstances in which DNR orders are used in individual patients with stroke are not available in the administrative data sets used for hospital mortality measures (such as the proposed CMS measure), it is unclear whether an important predictor of early stroke mortality (DNR order use) is associated with higher or lower quality of care.
In this study, we investigated the variation in early DNR use at the hospital level and the extent to which a proposed metric of hospital quality (inpatient stroke mortality) is affected by accounting for differences in early DNR use.
We performed a study of all inpatient discharges for the state of California from 2005 to 2011 using the State Inpatient Database of the Healthcare Cost and Utilization Project, Agency for Healthcare Research and Quality. The State Inpatient Database captures all inpatient discharges within a given year. California was selected for this analysis because of its large population, large number of hospitals, and reporting of whether early DNR orders (within the first 24 hours of admission) were present.
Inclusion and Exclusion Criteria
Hospital visits were included in this study if the patient was aged ≥50 years and the primary discharge diagnosis was an ischemic stroke diagnosis (International Classification of Diseases, Ninth Revision, Clinical Modification [ICD-9-CM] codes 433.x1, 434.x1, and 436). This algorithm has a positive predictive value of 88% and sensitivity of 74% for identifying ischemic stroke.6,7 Hospitals were excluded if they had an average DNR proportion >50% and an in-hospital case fatality >80% because these facilities were inferred to be hospices (5 facilities and 212 total discharges). We also excluded low-volume facilities with <10 total ischemic stroke hospitalizations during the study period (16 facilities and 49 total discharges).
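As an illustration, the inclusion and exclusion rules above could be sketched as follows. This is a hypothetical sketch, not the authors' code: the column names (AGE, DX1, HOSPID, DIED, DNR) are illustrative stand-ins, not actual State Inpatient Database field names.

```python
# Hypothetical sketch of the cohort-selection rules; AGE, DX1, HOSPID,
# DIED, and DNR are illustrative column names, not actual SID fields.
import pandas as pd

def is_ischemic_stroke(dx: str) -> bool:
    """ICD-9-CM primary-diagnosis algorithm: 433.x1, 434.x1, or 436."""
    return (len(dx) == 5 and dx[:3] in ("433", "434") and dx[4] == "1") \
        or dx.startswith("436")

def select_cohort(df: pd.DataFrame) -> pd.DataFrame:
    # Patient-level inclusion: age >=50, ischemic stroke as the primary
    # diagnosis, and a non-missing early-DNR flag.
    df = df[(df["AGE"] >= 50)
            & df["DX1"].map(is_ischemic_stroke)
            & df["DNR"].notna()]
    # Hospital-level exclusions: inferred hospices (mean DNR >50% AND
    # case fatality >80%) and low-volume facilities (<10 stroke stays).
    g = df.groupby("HOSPID").agg(n=("DIED", "size"),
                                 dnr=("DNR", "mean"),
                                 cfr=("DIED", "mean"))
    keep = g[~((g["dnr"] > 0.5) & (g["cfr"] > 0.8)) & (g["n"] >= 10)].index
    return df[df["HOSPID"].isin(keep)]
```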
Primary Outcome and Covariate Definitions
The primary outcome was whether a patient died during their hospitalization. The primary exposure variable was whether an order to limit resuscitation efforts was written within the first 24 hours of admission (early DNR).8,9 The database does not differentiate between DNR orders that are newly placed during the hospitalization and those in place at the time of admission, and, therefore, both of these scenarios would be coded as early DNR. A small number of individuals were excluded (0.02% of the sample) because no data were recorded on whether an early DNR order was present. Comorbidities were determined using the components of the modified Charlson definition, and life-sustaining treatments were identified using standardized definitions.10,11 Vascular risk factors were defined using the Healthcare Cost and Utilization Project single-level clinical classification system.12 Whether a patient had a previous ischemic stroke, intracerebral hemorrhage (ICD-9-CM 431.xx), or transient ischemic attack (435.xx) at any time during the study period was determined by linking visit-level data using Healthcare Cost and Utilization Project revisit variables.13
Descriptive statistics were presented to characterize the population of visits, both overall and by early DNR status. Formal unadjusted statistical comparisons by DNR status were not presented because of the large sample size and high likelihood of statistically significant but clinically unimportant differences. For each hospital, the mean proportion of ischemic stroke visits with an early DNR order was calculated and summarized at the hospital level by quintiles. To determine whether this variation was because of variation in patient characteristics, we built a multilevel logistic regression model with early DNR status as the dependent variable; patient demographics, vascular risk factors, all Charlson comorbidities, and previous cerebrovascular events as independent variables; and a random hospital-level intercept. The proportion of variance accounted for at the hospital level was calculated using the intraclass correlation coefficient.
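For a random-intercept logistic model, the intraclass correlation is commonly computed on the latent-response scale as the hospital-level variance divided by that variance plus the standard logistic residual variance (π²/3). The text does not state which formulation the authors used, so the sketch below shows only this one common convention.

```python
import math

def latent_icc(hospital_variance: float) -> float:
    """Latent-scale ICC for a random-intercept logistic model:
    var(hospital) / (var(hospital) + pi^2/3), where pi^2/3 is the
    variance of the standard logistic residual distribution."""
    return hospital_variance / (hospital_variance + math.pi ** 2 / 3)
```

Under this convention, an estimated hospital-level variance of about 1.04 would correspond to an ICC of about 0.24, the adjusted value reported below; this back-calculation is illustrative only.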
Mortality Modeling and Hospital Classification
Our mortality model was designed to reflect CMS mortality models generally; however, 2 main differences in the data sets necessitated some modification. First, our data set includes all stroke admissions and is not limited to the Medicare fee-for-service population; consequently, we adjusted for insurance status. Second, our data set only includes hospitalization data and does not include outpatient data or data on whether a patient was consistently in California. To account for the fact that some of the risk adjustors included in the CMS models may be less reliably measured in the context of a single hospital claim, we adjusted for comorbidities by including all individual-level Charlson comorbidities11 that are reliably measured in hospital discharge data, following the approach of Lichtman et al.14
Our baseline mortality model used multilevel logistic regression to predict the primary outcome (inpatient mortality) while including patient-level risk adjustors (age, sex, race, primary insurance, all individual-level Charlson comorbidities, vascular risk factors, and previous cerebrovascular events), whether a patient was transferred in, and a random hospital-level intercept (non-DNR model). The random hospital intercept can be interpreted as the effect of an individual hospital on mortality, independent of all other risk adjustors. To determine how hospital effects change when accounting for DNR status, we repeated our baseline model with an indicator variable for early DNR order at the patient level (DNR model). Hospital intercepts were then ranked in both models and compared using descriptive statistics. To identify the highest- and lowest-performing hospitals, we identified above average, average, and below average hospitals in both models using various cut points, specifically the highest- and lowest-performing 2.5%, 5%, and 10% of hospitals in each model. We then determined the degree to which hospital classification changed between the 2 models. Although conceptually similar, this approach is not identical to the CMS approach for other disease processes (which compares the 95% confidence interval for hospital mortality with an average value); we opted for this approach to better understand the degree of hospital reclassification because not all approaches to hospital classification rely on the CMS cut points.1,15
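The cut-point classification and reclassification count described above can be sketched as follows, assuming the fitted hospital random intercepts from each model are available as arrays. This is an illustration of the ranking logic only, not the authors' implementation.

```python
import numpy as np

def classify(intercepts: np.ndarray, frac: float) -> np.ndarray:
    """Label hospitals above average / average / below average using the
    lowest and highest `frac` of the ranked random intercepts (a lower
    intercept implies lower adjusted mortality)."""
    n = len(intercepts)
    k = max(1, int(round(n * frac)))
    order = np.argsort(intercepts)
    labels = np.array(["average"] * n, dtype=object)
    labels[order[:k]] = "above average"   # lowest adjusted mortality
    labels[order[-k:]] = "below average"  # highest adjusted mortality
    return labels

def n_reclassified(base_model: np.ndarray, dnr_model: np.ndarray,
                   frac: float) -> int:
    """Count hospitals whose label changes between the two models."""
    return int(np.sum(classify(base_model, frac) != classify(dnr_model, frac)))
```

Running this at frac = 0.025, 0.05, and 0.10 would correspond to the 2.5%, 5%, and 10% cut points used in the analysis.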
Between 2005 and 2011, we identified a total of 252 368 admissions for ischemic stroke at 355 hospitals. Overall in-hospital case fatality was 5.9%. A total of 13.3% of patients admitted to California hospitals had a DNR order placed within the first 24 hours of hospital admission. The proportion of hospitalizations with early DNR orders was generally similar over time, from 13.5% in 2005 to 13.1% in 2011.
Patients with early DNR orders generally tended to be older, more likely to be white, and more likely to be women when compared with patients without early DNR orders (Table 1). Comorbid medical conditions that might affect early mortality also tended to be more common among patients with early DNR orders, including a history of myocardial infarction, congestive heart failure, dementia, chronic obstructive pulmonary disease, kidney disease, and cancer (including metastatic disease).
Inpatient mortality was higher in patients with early DNR orders (19.2% versus 3.9%; P<0.001). The use of certain life-sustaining interventions, including gastrostomy tube placement and intubation, was more common in the early DNR population.
Variation in the Use of DNR Orders and Association With Inpatient Mortality
Substantial interhospital variability was seen in the use of early DNR orders (Figure 1). In an unadjusted analysis, the use ranged from 2.1% of hospitalizations (quintile 1) to 32.0% of hospitalizations (quintile 5). After adjusting for patient-level factors, variation was somewhat more constrained, ranging from 2.2% (quintile 1) to 23.2% (quintile 5). The intraclass correlations were 0.26 (unadjusted) and 0.24 (adjusted), meaning that between 24% and 26% of the variance seen in early DNR use was explained at the hospital level. Hospitals with higher early DNR use had higher unadjusted inpatient mortality than hospitals with lower early DNR use, as mortality more than doubled from quintile 1 (4.2%) to quintile 5 (8.7%).
Effect of DNR Orders on Mortality Estimates and Hospital Rankings
In the DNR model, DNR status was a strong predictor of mortality, with an odds ratio of 4.71 (95% confidence interval, 4.51–4.90). As a consequence, the DNR model tended to have a lower estimate of adjusted hospital mortality when compared with that of the baseline non-DNR model. Relative decreases of ≥10% were observed in 190 hospitals, of ≥20% in 37 hospitals, and of ≥30% in 7 hospitals (maximum relative change, –42% and maximum absolute change, –1.9%). The DNR model resulted in relative increases in mortality estimates of ≥10% in 84 hospitals, of ≥20% in 32 hospitals, and of ≥30% in 3 hospitals (maximum relative change, +43% and maximum absolute change, +2.5%).
Across the rank spectrum, hospital rank varied between the 2 predictive models (Figure 2). Using a classification scheme that identifies high-performing and low-performing hospitals by their presence in the highest or lowest 2.5% of hospital mortality, the DNR model would result in reclassification of 6/355 hospitals (1.7%) when compared with the classification on the baseline, non-DNR model. Classification schemes that identified the highest and lowest 5% or 10% of hospitals would result in reclassification of 18/355 (5.0%) and 56/355 (15.8%) of hospitals on the DNR model when compared with those of the non-DNR model, respectively (Table 2). Focusing on the subset of hospitals categorized as below average using the non-DNR model, 22% (2/9) would be reclassified using the 2.5% threshold, 28% (5/18) would be reclassified using the 5% threshold, and 36% (13/36) using the 10% threshold in the DNR model.
Failure to account for DNR status can lead to considerable differences in estimates of adjusted inpatient mortality in ischemic stroke. Our data suggest that the difference in these estimates will affect hospital classification in public-reporting systems that are based on adjusted mortality. The degree of reclassification is not trivial because we found that ≈20% to 40% of hospitals classified as low performing in the baseline non-DNR model were reclassified as average once DNR orders were taken into account. A major goal of public reporting is to identify lower performing hospitals so that improvement efforts can be undertaken; given the association we found between early DNR status and mortality, unless one assumes that DNR status is a marker of low-quality care, the system may not be accomplishing this objective.
More broadly, our findings further call into question whether adjusted hospital-level ischemic stroke mortality is a valid measure of quality of care, and whether hospital performance should be widely and publicly reported before the factors that drive stroke mortality are better understood. Stroke may be different from other conditions for which CMS publicly reports mortality (eg, congestive heart failure) in several important ways. First, stroke is responsible for a large burden of disability,16 and many patients with stroke and their families opt against high-intensity care focused on improving survival when facing survival with severe disability and delayed death.17,18 These differences imply that early mortality after stroke is not always the worst outcome from a patient perspective and that hospital responses to public reporting could potentially disincentivize patient-centered, preference-sensitive care.
Second, high-quality evidence-based acute stroke care (eg, early thrombolytic administration, use of statins) is more strongly linked to disability reduction or secondary prevention as opposed to mortality. This study and other data show that even small changes in the intensity of care after stroke may have a substantial effect on mortality; as much as 40% of mortality was attributed to withdrawal of life-sustaining interventions in one study.3 Given that attempts to improve evidence-based process measures may have more modest effects on early stroke mortality than even small changes in the use of life-sustaining treatments, institutions may make conscious or unconscious efforts to increase the intensity of care provided to patients with stroke, at the cost of patient-centeredness. Similar unintended consequences have been observed in other contexts when faced with problematic incentives.19,20
Beyond the implications for national hospital comparisons and whether mortality (with or without accounting for DNR status) is a valid marker of quality of care, this study raises other issues. Among hospitals in California, there is wide variation in the proportion of patients hospitalized with ischemic stroke who have early DNR orders, consistent with other studies of DNR use. This difference in DNR use is associated with concomitant wide variation in inpatient stroke mortality, resulting in these questions: Is excess mortality occurring at high-DNR-use, high-mortality hospitals? Or alternatively, are limitations on intensity of care not being established frequently enough at low-DNR-use, low-mortality hospitals? Is it even possible to define acceptable and unacceptable levels of hospital-based mortality without understanding patient preferences on DNR status and the use of life-sustaining interventions?21 Future research is needed to develop a more detailed understanding of the patients who receive DNR orders and how DNR orders (and care limitations more broadly) are currently used in stroke care to begin addressing these questions. In addition, more work is needed to understand why hospitals account for ≈25% of the variance in DNR orders, to determine whether this is because of hospital-level processes or cultures, provider-specific factors, or local/regional variability in patient preferences for life-sustaining treatments.22
The use of DNR orders does not preclude high-quality care or the use of high-intensity interventions after stroke.3,23 A specific designation of care as palliative or comfort only was not available in our data set, but given the number of DNR patients who underwent life-sustaining interventions (eg, percutaneous endoscopic gastrostomy placement), it does not seem that early DNR use in ischemic stroke should be considered a direct proxy for palliative care. In patients with intracerebral hemorrhage, most patients in whom life-sustaining measures were withdrawn also had early DNR orders; however, most DNR patients did not ultimately have life-sustaining measures withdrawn or withheld.24 It is imperative that future work more thoroughly investigates the association between stroke mortality and decisions on the intensity of care after stroke, including DNR orders. Ideally, such work would clarify the circumstances surrounding DNR status and life-sustaining measure use after stroke and determine whether patient preferences were incorporated into decision making. Assessing the quality of decision making25 may help explain practice variation in some disease processes (such as coronary artery disease), and given the association between early stroke mortality and life-sustaining intervention use, it may explain some variation in outcomes as well.
Our study has several limitations. Information on stroke severity was not available in this data set, although this is also a key limitation in current stroke mortality measures under development.26 In this study, DNR orders are likely acting as proxies for stroke severity to some extent because patients with severe stroke are more likely to have DNR orders. However, it is likely that the DNR effect observed here is, at least in part, independent of the effect of severity for 2 reasons. First, the magnitude of hospital reclassification when accounting for DNR use is greater than the magnitude of reclassification associated with accounting for stroke severity.26 Second, DNR use varies considerably more widely at the hospital level than does stroke severity.27 The models that we used are an approximation of the model under development by CMS, so some differences in risk adjustment methodologies are possible. We also used inpatient mortality as opposed to the 30-day time horizon for mortality that will be used by CMS because we did not have access to data beyond the incident hospitalization; it is unclear how these different timeframes would affect our results.
There is likely to be considerable hospital reclassification in publicly reported measures that judge hospital quality based on early stroke mortality without accounting for DNR orders. Given the variable potential meanings and implications of DNR orders, models that incorporate DNR orders without an assessment of concordance with patient preferences would also be limited in their ability to assess treatment quality. Therefore, the use of stroke mortality as a hospital-level quality measure should be reconsidered until the relationship of care limitations, quality of care, and mortality is better understood.
Sources of Funding
Dr Kelly: Donald W. Reynolds Foundation; Dr. Zahuranec: National Institutes of Health (NIH) grant K23 AG038731; Dr Morgenstern: NIH grants R01 NS038916 and R01 NS062675; Dr Burke: NIH grant K08 NS082597.
- Received December 18, 2013.
- Revision received January 9, 2014.
- Accepted January 10, 2014.
- © 2014 American Heart Association, Inc.
- 8. The California Hospital Discharge Data Reporting Manual. 3rd ed. Sacramento, CA: Office of Statewide Planning and Development; 1999–2000.
- 12. Healthcare Cost and Utilization Project Clinical Classification Software (CCS) for ICD-9-CM and ICD-10 administrative data. Agency for Healthcare Research and Quality. http://www.hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp. Accessed June 12, 2012.
- 13. Healthcare Cost and Utilization Project supplemental variables for revisit analyses. Agency for Healthcare Research and Quality. http://www.hcup-us.ahrq.gov/toolssoftware/revisit/revisit.jsp. Accessed June 12, 2012.
- 15. How we ranked the best hospitals 2013–14. U.S. News and World Report. http://health.usnews.com/health-news/best-hospitals/articles/2013/07/16/how-we-ranked-the-best-hospitals-2013-14-an-faq. Accessed December 13, 2013.