Abstract

Discrete choice experiments (DCEs) are economic tools that elicit the stated preferences of respondents. Because of their increasing importance in informing the design of health products and services, it is critical to understand the extent to which DCEs give reliable predictions outside of the experimental context. We systematically reviewed published DCE studies that compared predictions with choices made in reality, extracting individual-level data to estimate a bivariate mixed-effects model of pooled sensitivity and specificity. Eight studies met the inclusion criteria, and six of these provided sufficient data for inclusion in a meta-analysis. Pooled sensitivity and specificity estimates were 88% (95% CI 81, 92%) and 34% (95% CI 23, 46%), respectively, and the area under the SROC curve (AUC) was 0.60 (95% CI 0.55, 0.64). These results indicate that DCEs can produce reasonable predictions of health-related behaviors. There remains a great need for future research on the external validity of DCEs, particularly empirical studies assessing the predicted and revealed preferences of a representative sample of participants.
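To make the pooled quantities concrete, the sketch below computes per-study sensitivity and specificity from 2x2 tables of predicted versus observed choices. The counts are hypothetical (not taken from the review), and the paper itself pools these via a bivariate mixed-effects model rather than the simple per-study calculation shown here.

```python
# Illustrative sketch only: per-study sensitivity and specificity from
# hypothetical 2x2 counts of predicted vs. observed behavior.
# The review pools these with a bivariate mixed-effects model.

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical per-study tables: (TP, FN, TN, FP)
studies = [(45, 5, 10, 20), (30, 6, 8, 12), (52, 9, 15, 25)]

for i, (tp, fn, tn, fp) in enumerate(studies, 1):
    se, sp = sens_spec(tp, fn, tn, fp)
    print(f"study {i}: sensitivity={se:.2f}, specificity={sp:.2f}")
```

The pattern in the review's results (high sensitivity, low specificity) would appear here as `se` values near 1 and `sp` values well below 0.5.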

Highlights

  • Discrete choice experiments (DCEs) ask participants to make choices between hypothetical alternatives, using choice modeling methods to analyze data

  • This paper considers variations in external validity in health DCEs attributable to hypothetical bias, and is the first systematic review and meta-analysis assessing the ability of DCEs to predict health behaviors

  • The area under the summary receiver operating characteristic (SROC) curve (AUC) can be a useful summary statistic of predictive ability; the AUC we present in Fig. 4 (0.60 [95% CI 0.55, 0.64]) provides further evidence that DCEs have a moderate ability to predict choices. Although there are no firm thresholds for a “good” AUC, meta-analyses of diagnostic tests draw similar conclusions [63, 64]

Introduction

Discrete choice experiments (DCEs) ask participants to make choices between hypothetical alternatives, using choice modeling methods to analyze data. They are attractive tools for research and policy as they offer a flexible methodology for estimating which attributes are important in decision-making. Data are analyzed using discrete choice models [1], the results of which can be used to determine the relative importance of different attributes to respondent choices. Because respondents are not obliged in reality to make the choices they indicate in a DCE, hypothetical bias may reduce the usefulness of DCE results [17, 18]. There has been very little empirical work assessing whether choices made in DCEs reflect those made in reality, or the circumstances in which they may offer more or less reliable inference [19].
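The discrete choice models mentioned above can be sketched in their simplest binary form: utility is linear in attribute differences between two alternatives, and attribute weights are recovered by maximum likelihood. The setup below is an assumed, simulated illustration (the attribute values, weights, and sample size are hypothetical, not from any study in the review).

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of a binary choice model, assuming a linear utility
# in two attributes; data are simulated, not drawn from the review.
rng = np.random.default_rng(0)
n = 500
x_diff = rng.normal(size=(n, 2))        # attribute differences, A - B
true_beta = np.array([1.0, -0.5])       # hypothetical true weights
p_choose_a = 1 / (1 + np.exp(-x_diff @ true_beta))
y = (rng.random(n) < p_choose_a).astype(float)  # 1 if A chosen

def neg_log_lik(beta):
    u = x_diff @ beta
    # negative log-likelihood of the logit choice probabilities
    return -np.sum(y * u - np.log1p(np.exp(u)))

beta_hat = minimize(neg_log_lik, np.zeros(2)).x
print("estimated attribute weights:", beta_hat)
```

The ratio of estimated coefficients gives the relative importance of the attributes, which is the quantity DCE analyses typically report; hypothetical bias is the concern that these weights, estimated from stated choices, may differ from those governing real behavior.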

