Abstract

Since the early 1980s, one major focus of US health policymakers has been controlling the growth rate of expenditures in the US health care system. Despite numerous cost-containment efforts, advances in the ability to treat illness, particularly through new medical technology, have resulted in an increasing share of resources being consumed by the health care industry. In 2000, health care consumed 13.1% of the US gross national product, up from 8.8% in 1985 (1). Continuing concern about the impact of technological improvements on resource consumption has led third-party payers and policymakers increasingly to evaluate the cost of incremental improvements in health outcomes. For example, the Food and Drug Administration now requires that new drugs be shown to be not only efficacious but also cost-effective. As a result, the number of cost-effectiveness analyses (CEAs) has grown rapidly, a trend particularly associated with the approval of new medications and medical devices (2).
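The "cost of incremental improvements in health outcomes" referred to above is conventionally summarized in a CEA by the incremental cost-effectiveness ratio (ICER). As a minimal sketch of that standard formulation (the symbols are illustrative and not taken from this paper):

$$
\text{ICER} \;=\; \frac{C_{\text{new}} - C_{\text{comparator}}}{E_{\text{new}} - E_{\text{comparator}}}
$$

where $C$ denotes cost and $E$ a measure of health effect (for example, quality-adjusted life years), so the ratio expresses the additional cost incurred per additional unit of health benefit gained relative to the comparator.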
