Public Reporting of 30-Day Mortality for Patients Hospitalized With Acute Myocardial Infarction and Heart Failure

Harlan M. Krumholz, MD, SM, and Sharon-Lise T. Normand, PhD

From the Section of Cardiovascular Medicine and the Robert Wood Johnson Clinical Scholars Program, Department of Internal Medicine; Section of Health Policy and Administration, School of Public Health, Yale University School of Medicine; and Center for Outcomes Research and Evaluation, Yale-New Haven Hospital, New Haven, Conn (H.M.K.); and Department of Health Care Policy, Harvard Medical School, and Department of Biostatistics, Harvard School of Public Health, Boston, Mass (S.-L.T.N.).

Circulation. 2008;118:1394–1397. Originally published August 25, 2008. https://doi.org/10.1161/CIRCULATIONAHA.108.804880

For cardiac care, the Centers for Medicare & Medicaid Services (CMS) has posted information since 2004 about how hospitals perform in delivering specific types of care. These quality measures (the Table) include information about how often eligible patients are treated with aspirin, β-blockers, angiotensin-converting enzyme inhibitors, or angiotensin receptor blockers; how often smokers are counseled to quit; how rapidly hospitals provide reperfusion therapy for patients with an ST-segment elevation myocardial infarction; and how often the ejection fraction is measured for patients with heart failure. All of these measures assess care that is strongly recommended by guidelines. The measures include only those patients who should receive this care and are reported at the hospital level simply as percentages. The performance on these measures should ideally be 100%, and the rates have improved markedly over time.
Table. Acute Myocardial Infarction and Heart Failure Process-of-Care Measures

Heart attack process-of-care measures
  Percent of heart attack patients given ACE inhibitor or ARB for LVSD
  Percent of heart attack patients given aspirin at arrival
  Percent of heart attack patients given aspirin at discharge
  Percent of heart attack patients given β-blocker at arrival
  Percent of heart attack patients given β-blocker at discharge
  Percent of heart attack patients given fibrinolytic medication within 30 min of arrival
  Percent of heart attack patients given PCI within 90 min of arrival
  Percent of heart attack patients who reported smoking given smoking cessation advice/counseling

Heart failure process-of-care measures
  Percent of heart failure patients given ACE inhibitor or ARB for LVSD
  Percent of heart failure patients given an evaluation of LVS function
  Percent of heart failure patients given discharge instructions
  Percent of heart failure patients who reported smoking given smoking cessation advice/counseling

ACE indicates angiotensin-converting enzyme; ARB, angiotensin receptor blocker; LVSD, left ventricular systolic dysfunction; and PCI, percutaneous coronary intervention. Modified from Hospital Compare (http://hospitalcompare.hhs.gov), with permission.

On June 21, 2007, CMS expanded the measures to include outcomes and posted information on its "Hospital Compare" website1 regarding hospital-specific 30-day mortality rates for patients with acute myocardial infarction and heart failure. These measures used administrative claims data and were validated against measures using medical record data; the measures were approved by the National Quality Forum.2,3 In 2007, the publicly reported information was limited to describing hospitals as having rates that were higher, lower, or no different than the national average. Hospitals received information about their rates, their patients, and comparisons with other hospitals in their state and with the nation.

This year, CMS is expanding the publicly available information. For acute myocardial infarction, heart failure, and now pneumonia, CMS will post the calculated risk-standardized 30-day mortality rates (RSMRs) and the 95% interval estimates for each hospital that treated Medicare patients with these conditions; hospital volume for each condition also will be posted. As before, "Hospital Compare" will categorize hospital performance as better than, worse than, and no different than the national rate. This year's report is based on admissions from July 1, 2006, through June 30, 2007. This Clinician Update provides information about these measures and how they should be used.

Why Report Outcomes?

Quality measures have traditionally focused on whether certain actions are performed. The development of such measures, called process measures, generally follows a standardized methodology and is based on recommendations from clinical practice guidelines.4 CMS and The Joint Commission have publicly reported process measures for acute myocardial infarction, heart failure, and pneumonia since 2005, and performance on these measures has increased steadily since that time.5

The need to consider patient outcomes (what actually happens to patients) in quality measurement is based on several observations.6 The process measures provide only a narrow, yet important, perspective on quality of care; there are many more decisions and processes that occur in the course of caring for patients.
In addition, for some current process measures, there is little variation in performance, leaving little opportunity for improvement, but there is evidence of clinically important differences among hospitals in their 30-day mortality rates.2,3 In the absence of outcome measures, hospitals and clinicians have no way to assess and benchmark overall clinical performance from the patient's perspective.

Why Use All-Cause 30-Day Mortality?

All-cause mortality, rather than mortality from cardiac causes, was selected because it is what matters most to patients. Moreover, noncardiac deaths may still be related to the quality of care provided during a cardiac hospitalization, eg, preventable infections related to hospitalization that could be the proximate cause of death.

The measure assesses mortality within 30 days from admission. This standardized period was chosen to ensure a fair assessment of all hospitals and to prevent differences in transfer rates or variations in length of stay from affecting the measurement.

What Is the Attribution Rule?

If a patient is transferred, mortality (or survival) is attributed to the hospital to which the patient was initially admitted. The hospital that transfers the patient has control over where the patient is sent. Attribution to the initial hospital encourages feedback about patient outcomes and a sense of collaboration in improving the transfer process, communication of information, and coordination of effort. It also avoids the perverse incentive for hospitals to transfer patients who are doing poorly to avoid accountability for the death.

Are Claims Data Valid?

Prior studies have indicated that patient-level factors in administrative claims may not agree with information from the medical record.7 CMS therefore tested its models that used claims data against models using medical record data. In this case, the validation was not performed by comparing individual variables but by determining whether the RSMRs from claims-based models could serve as proxies for the RSMRs produced from models using medical record data, acknowledging that even the medical record model has limitations. The claims-based RSMRs agreed closely with the medical record-based RSMRs. This validation provided the impetus to proceed with the effort to publicly report outcomes using the claims data, the only national source of data about patient outcomes.

How Are Rates Calculated?

The RSMRs are calculated from a "hierarchical" regression model. Regression models use information about disease severity and comorbidity to predict outcomes. Hierarchical regression models also use disease severity information but are designed for situations in which the primary objective is to compare hospital outcomes rather than patient outcomes; the hierarchical model accounts for the similarity of outcomes within a hospital that may be due to hospital quality. Each hospital RSMR is similar to an "observed"-to-"expected" ratio that is then multiplied by the national average so that rates rather than ratios are reported. The RSMR, as a specific number, is our best estimate of the hospital's rate and is best understood by comparing it with the national average. If the hospital's rate is 14% and the national rate is 16%, then the hospital is doing better than what might be expected given the patients it has.

CMS reports a 95% interval estimate for each hospital RSMR and uses this interval to classify hospitals.
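To make the arithmetic concrete, the sketch below (in Python, with illustrative numbers only) assumes a fitted hierarchical model has already produced, for each hospital, a "predicted" number of deaths based on the hospital's own estimated effect and an "expected" number based on the average hospital effect; the hypothetical rsmr and classify functions then show the ratio-times-national-rate calculation and the interval-based classification described here. It is a simplified illustration, not the CMS implementation.

```python
# Minimal sketch of the RSMR logic described above (illustrative values only).
# Inputs are assumed to come from a previously fitted hierarchical model:
#   predicted_deaths: model prediction using the hospital's own estimated effect
#   expected_deaths:  model prediction using the average hospital effect
# This is not the CMS implementation, just the arithmetic it describes.

def rsmr(predicted_deaths: float, expected_deaths: float, national_rate: float) -> float:
    """Risk-standardized mortality rate: (predicted / expected) x national rate."""
    return (predicted_deaths / expected_deaths) * national_rate

def classify(interval_low: float, interval_high: float, national_rate: float) -> str:
    """Classify a hospital by comparing its 95% interval with the national rate."""
    if interval_high < national_rate:
        return "better than the national rate"
    if interval_low > national_rate:
        return "worse than the national rate"
    return "no different than the national rate"

# Worked example mirroring the text: a best estimate of 14% against a 16%
# national rate, but a 95% interval of 11% to 17% that overlaps the national rate.
national_rate = 0.16
print(round(rsmr(predicted_deaths=35.0, expected_deaths=40.0, national_rate=national_rate), 3))  # 0.14
print(classify(0.11, 0.17, national_rate))  # "no different than the national rate"
```

In this sketch a hospital is flagged only when its entire interval lies on one side of the national rate, which is why the overlapping interval in the example is classified as no different from the national rate even though the point estimate of 14% sits below the 16% national rate.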
The interval estimates incorporate the uncertainty in the estimation and report a range that encompasses the hospital's true RSMR within a certain degree of probability. What defines an appropriate interval estimate (80%, 95%, or 99%) may vary depending on the preference of the users. The interval estimate conveys the uncertainty around the estimate. If the interval estimate goes from 11% to 17% and the national rate is 16%, then we would say that the hospital could be much better than expected, with a rate as good as 11%, or slightly worse than average, at 17%. In this case, because the interval estimate overlaps the national average, we cannot say with great certainty that the hospital is better than average, even though our best estimate is that it is. If the range were entirely below the national average, for example, if the upper limit of the interval estimate were 15%, then we would have a high degree of certainty that the hospital is better than average.

How Should the RSMR Be Interpreted?

Despite being on the "Hospital Compare" website, direct comparisons of specific RSMRs are generally not valid. Each hospital RSMR quantifies how a hospital has performed with the patients that hospital treated, and it would not be proper to compare hospitals with markedly different types of patients. The best interpretation of the results is for a given hospital to assess how it fared compared with the national average.8

How Are Small-Volume Hospitals Handled?

Small institutions provide very little information on which to estimate performance. Options available to CMS included excluding small hospitals entirely, grouping them into 1 category and reporting an aggregate rate, or including them as individual hospitals and reflecting their accuracy. Because the hierarchical model is used when the primary objective is to compare hospitals, it avoids the problem of spuriously identifying small-volume hospitals as performing better than or worse than expected. The current iteration of the measures is accompanied by information about how many cases were considered in the calculations.
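The protection for small hospitals reflects a general property of hierarchical models: estimates based on few cases are pulled ("shrunk") toward the overall average, so an extreme crude rate from a handful of admissions does not translate into an extreme RSMR. The sketch below is a deliberately simplified normal-normal approximation with hypothetical variance values and a made-up shrunken_rate function; it illustrates the shrinkage idea only and is not the CMS model.

```python
# Simplified illustration of hierarchical-model shrinkage (not the CMS model).
# A hospital's estimate is a weighted average of its own crude rate and the
# national rate, with the weight on the hospital's own data growing with volume.
# The between-hospital variance below is hypothetical, chosen only for illustration.

def shrunken_rate(observed_deaths: int, cases: int, national_rate: float,
                  between_hospital_var: float = 0.0005) -> float:
    """Normal-normal approximation: pool the crude rate with the national rate."""
    crude_rate = observed_deaths / cases
    within_var = national_rate * (1 - national_rate) / cases  # sampling noise
    weight = between_hospital_var / (between_hospital_var + within_var)
    return weight * crude_rate + (1 - weight) * national_rate

national_rate = 0.16
# Two deaths among 8 cases is a 25% crude rate, but the small hospital's
# estimate is pulled most of the way back toward the 16% national rate.
print(round(shrunken_rate(2, 8, national_rate), 3))
# The same 25% crude rate at a hospital with 800 cases moves much less.
print(round(shrunken_rate(200, 800, national_rate), 3))
```

In this toy example, a 25% crude rate based on 8 cases shrinks to roughly 16%, essentially the national rate, whereas the same crude rate based on 800 cases retains most of the difference.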
What Are the Limitations of the Measures?

Because of their reliance on administrative claims data, there is a delay in the availability of the data to make the calculation. In general, the publicly reported measures represent performance that occurred 12 to 24 months before the report. In addition, the models may not account for all important unmeasured factors that may differentiate a particular institution. It is possible that patient selection may contribute to the results in ways that are not accounted for by the model.

What Should Clinicians Do?

Despite the limitations of the mortality measures, they hold the promise of revealing potential opportunities to improve care. Every clinician should see the reports detailing performance that have been given to hospitals, yet unfortunately many hospitals have not downloaded the reports for review. Clinicians can request that their administrators obtain the reports through "QualityNet," the website through which CMS distributes confidential quality information to hospitals.

The goal is to use measurement as a tool to create an imperative to improve and to provide perspective regarding performance. Clinicians should engage constructively in this effort and should examine adverse outcomes within their institution. The effort is undertaken with the understanding that most deaths are neither preventable nor the result of substandard care. Nevertheless, there is undeniable evidence that some deaths in our healthcare system would have been preventable had optimal care been provided. For example, there is evidence of delays in treatment, medication misdosing, and misdiagnoses. A recent study in Ontario examined deaths among patients who underwent bypass surgery at various institutions.9 Although the Ontario hospitals had favorable mortality rates, about a third of the deaths were deemed preventable had optimal care been provided. That analysis indicated opportunities for improvement in the care of these patients, and such an approach could be generalizable to other conditions.

How Should Patients and the Community Use the Information?

Patients may use the information about RSMRs in discussions with clinicians about hospital performance and the initiatives that are being adopted to ensure high-quality care. The community and hospital boards may take an interest, and the measures may provide an incentive to hospitals to invest more fully in delivering high-quality care and improving the essential systems.

Patients should not use the measures to shop for hospitals at the time of an acute illness. All patients should be instructed to call 911 for emergencies and proceed to the nearest hospital. The measures should not cause delay in seeking care, and they are not intended to be used in that setting.

Additional Resources

Additional reading on the measure methodology, the rationale for outcomes measurement, and the method of hierarchical modeling can be found in the following sources:

Normand S-LT, Shahian DM. Hierarchical modeling: statistical and clinical aspects of institutional profiling. Stat Sci. 2007;22:206–226.

Normand S-LT, Wang Y, Krumholz HM. Assessing surrogacy of data sources for institutional comparisons. Health Serv Outcomes Res Method. 2007;7:79–96.

Hospital-specific report: mock report: acute myocardial infarction 30-day mortality measure; heart failure 30-day mortality measure; pneumonia 30-day mortality measure. Available at: http://qualitynet.org/dcs/ContentServer?cid=1205442091862&pagename=QnetPublic%2FPage%2FQnetTier3&c=Page.

Disclosures

Dr Krumholz has contracts with the Colorado Foundation for Medical Care to develop outcomes and surveillance measures for public reporting. Dr Normand is funded by the Massachusetts Department of Public Health to monitor the quality of care after cardiac surgery or percutaneous coronary intervention.

Footnotes

Correspondence to Dr H.M. Krumholz, Yale University School of Medicine, 333 Cedar St, PO Box 208088, New Haven, CT 06520–8088. E-mail [email protected]

References

1. US Department of Health and Human Services. Hospital Compare. Available at: http://www.hospitalcompare.hhs.gov. Accessed July 4, 2008.

2. Krumholz HM, Wang Y, Mattera JA, Wang Y, Han LF, Ingber MJ, Roman S, Normand SL. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with an acute myocardial infarction. Circulation. 2006;113:1683–1692.

3. Krumholz HM, Wang Y, Mattera JA, Wang Y, Han LF, Ingber MJ, Roman S, Normand SL. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with heart failure. Circulation. 2006;113:1693–1701.

4. Spertus JA, Eagle KA, Krumholz HM, Mitchell KR, Normand SL. American College of Cardiology and American Heart Association methodology for the selection and creation of performance measures for quantifying the quality of cardiovascular care. Circulation. 2005;111:1703–1712.
5. Williams SC, Schmaltz SP, Morton DJ, Koss RG, Loeb JM. Quality of care in US hospitals as reflected by standardized measures, 2002–2004. N Engl J Med. 2005;353:255–264.

6. Krumholz HM, Normand SL, Spertus JA, Shahian DM, Bradley EH. Measuring performance for treating heart attacks and heart failure: the case for outcomes measurement. Health Aff (Millwood). 2007;26:75–85.

7. Jollis JG, Ancukiewicz M, DeLong ER, Pryor DB, Muhlbaier LH, Mark DB. Discordance of databases designed for claims payment versus clinical information systems: implications for outcomes research. Ann Intern Med. 1993;119:844–850.

8. Shahian DM, Normand S-LT. Comparison of "risk-adjusted" hospital outcomes. Circulation. 2008;117:1955–1963.

9. Guru V, Tu JV, Etchells E, Anderson GM, Naylor CD, Novick RJ, Feindel CM, Rubens F, Teoh K, Mathur A, Bonneau D, Cutrara C, Austin PC, Fremes SE. The relationship between preventability of death after CABG surgery and all cause risk-adjusted mortality rates. Circulation. 2008;117:2969–2976.
