Abstract

A prevailing method for alleviating the computational cost of analyzing massive data is to perform the analysis on a subsample of the full data. Optimal subsampling algorithms use non-uniform subsampling probabilities, derived by minimizing the asymptotic mean squared error of the subsample estimator, to achieve higher estimation efficiency for a given subsample size. The optimal subsampling probabilities for softmax regression have been studied under the baseline constraint, which treats one dimension of the multivariate response differently from the other dimensions. In this paper, we show that different model constraints lead to different optimal subsampling probabilities, and that the summation constraint corresponds to a better subsampling strategy than the baseline constraint in terms of balancing the responses among all categories. Furthermore, we derive the asymptotic distribution of the mean squared prediction error and minimize its asymptotic expectation to define optimal subsampling probabilities that are invariant to the model constraint. Simulations and a real data example demonstrate the effectiveness of the proposed optimal subsampling probabilities.
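To make the two-step optimal subsampling idea referenced above concrete, the sketch below implements a generic version for softmax regression with NumPy: a uniform pilot subsample produces a rough parameter estimate, non-uniform probabilities are computed from it, and an inverse-probability-weighted estimator is fit on the second subsample. The specific probability formula used here, pi_i proportional to ||y_i - p_i(beta_pilot)|| * ||x_i||, is an illustrative L-optimality-style choice under the baseline constraint from the earlier subsampling literature; it is not the constraint-invariant criterion proposed in this paper, and all function names (e.g., `two_step_subsample_estimator`) are hypothetical.

```python
# Minimal sketch of two-step optimal subsampling for softmax regression.
# Assumes Y is one-hot encoded with shape (n, K); illustrative only.
import numpy as np

def softmax_probs(X, beta):
    """Category probabilities under the baseline constraint (category 0 as baseline)."""
    # beta has shape (d, K-1); prepend a zero column for the baseline category.
    eta = np.column_stack([np.zeros(X.shape[0]), X @ beta])
    eta -= eta.max(axis=1, keepdims=True)            # numerical stability
    expo = np.exp(eta)
    return expo / expo.sum(axis=1, keepdims=True)     # shape (n, K)

def weighted_softmax_mle(X, Y, w, n_iter=200, lr=0.5):
    """Weighted maximum likelihood via plain gradient ascent (illustrative optimizer)."""
    d, K = X.shape[1], Y.shape[1]
    beta = np.zeros((d, K - 1))
    for _ in range(n_iter):
        P = softmax_probs(X, beta)
        resid = (Y - P)[:, 1:]                        # drop the baseline column
        grad = X.T @ (w[:, None] * resid) / w.sum()
        beta += lr * grad
    return beta

def two_step_subsample_estimator(X, Y, r0=200, r=1000, rng=None):
    """Pilot uniform subsample -> subsampling probabilities -> weighted estimator."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]

    # Step 1: uniform pilot subsample for a rough estimate of beta.
    idx0 = rng.choice(n, size=r0, replace=True)
    beta_pilot = weighted_softmax_mle(X[idx0], Y[idx0], np.ones(r0))

    # Step 2: non-uniform probabilities proportional to the per-observation
    # gradient norm evaluated at the pilot estimate (illustrative criterion).
    P = softmax_probs(X, beta_pilot)
    scores = np.linalg.norm(Y - P, axis=1) * np.linalg.norm(X, axis=1)
    pi = scores / scores.sum()

    idx = rng.choice(n, size=r, replace=True, p=pi)
    # Inverse-probability weights correct the bias induced by non-uniform sampling.
    w = 1.0 / (n * pi[idx])
    return weighted_softmax_mle(X[idx], Y[idx], w)
```

The inverse-probability weights in the final fit are what keep the subsample estimator consistent for the full-data estimator; the choice of pi is what the optimality criterion (and, in this paper, the model constraint) governs.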
