Abstract
This letter considers Bayesian binary classification where the data are assumed to consist of multiple time series (panel data) with binary class labels (binary choice). The observed data can be represented as $\{y_{it}, x_{it}\}_{t=1}^{T}$, $i = 1, \ldots, n$, where $y_{it} \in \{0, 1\}$ denotes the binary choice and $x_{it}$ denotes the exogenous variables. We consider prediction of $y_{it}$ by its own lags, as well as by the exogenous components. The prediction is based on a Bayesian treatment using a Gibbs posterior that is constructed directly from the empirical classification error. This approach is therefore less sensitive to misspecification of the probability model than the usual likelihood-based posterior, which is confirmed by Monte Carlo simulations. We also study the effects of various choices of $n$ and $T$, both numerically (by simulations) and theoretically (by considering two alternative asymptotic regimes: large $n$ and large $T$). We find that increasing $T$ reduces the prediction error more effectively than increasing $n$. We also illustrate the method in a real-data application on brand choice in yogurt purchases.
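For concreteness, the display below is a minimal sketch of the generic Gibbs-posterior construction the abstract refers to. The linear-index classifier $\mathbf{1}\{x_{it}'\theta > 0\}$, the learning-rate parameter $\lambda$, and the prior $\pi(\theta)$ are illustrative assumptions, not details taken from the letter itself (the letter also allows lags of $y_{it}$ among the predictors).

% Illustrative Gibbs posterior built from the empirical classification error;
% the classifier form, lambda, and prior are assumptions for this sketch.
\[
  \widehat{R}_{n,T}(\theta)
    \;=\; \frac{1}{nT}\sum_{i=1}^{n}\sum_{t=1}^{T}
      \mathbf{1}\bigl\{\, y_{it} \neq \mathbf{1}\{x_{it}'\theta > 0\} \,\bigr\},
  \qquad
  \pi_{n,T}(\theta \mid \text{data})
    \;\propto\; \exp\bigl\{-\lambda\, nT\, \widehat{R}_{n,T}(\theta)\bigr\}\,\pi(\theta).
\]

In a sketch of this kind, the empirical misclassification rate plays the role of a negative log-likelihood, so no probability model for $y_{it}$ given the predictors needs to be specified; this is the sense in which the Gibbs-posterior approach is less sensitive to misspecification than a likelihood-based posterior.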