Abstract
The Kolmogorov-Smirnov (KS) statistic is popular in many areas as the major performance evaluation criterion for binary classification due to its explicit business interpretation. Fang and Chen (2019) proposed a novel DMKS method that directly maximizes the KS statistic and compares favorably with popular existing methods. However, DMKS did not consider the critical problem of variable selection, since the special form of the KS statistic makes it very challenging to establish the asymptotic distribution of the DMKS estimator, which is most likely nonstandard. This intractable issue is handled by introducing a surrogate loss function, which leads to an n-consistent estimator of the true parameter up to a multiplicative scalar. A nonconcave penalty function is then incorporated to achieve variable selection consistency and asymptotic normality, that is, the oracle property. Empirical studies confirm the theoretical results and demonstrate the advantages of the proposed SKS (Surrogated Kolmogorov-Smirnov) method over the original DMKS method, which performs no variable selection.
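To make the abstract's ingredients concrete, the Python sketch below illustrates (i) the empirical KS statistic of a classifier's scores, (ii) a smoothed surrogate in which the indicator function is replaced by a logistic kernel so the objective becomes differentiable, and (iii) a nonconcave SCAD penalty. The logistic kernel, the bandwidth h, and the choice of SCAD are assumptions made purely for illustration; this is not the paper's actual SKS surrogate loss or estimation procedure.

```python
import numpy as np

def ks_statistic(scores, y):
    """Empirical KS statistic sup_t |F1(t) - F0(t)|, where F1 and F0 are
    the empirical CDFs of the scores in the positive (y == 1) and
    negative (y == 0) classes."""
    pos = np.sort(scores[y == 1])
    neg = np.sort(scores[y == 0])
    t = np.sort(scores)  # the observed scores suffice as candidate thresholds
    f1 = np.searchsorted(pos, t, side="right") / pos.size
    f0 = np.searchsorted(neg, t, side="right") / neg.size
    return np.abs(f1 - f0).max()

def smoothed_ks(beta, X, y, h=0.1):
    """Smooth surrogate of the KS objective for linear scores s = X @ beta.
    The indicator 1{s_i <= t} is replaced by a logistic kernel
    sigmoid((t - s_i) / h), so the objective is differentiable in beta.
    The kernel and bandwidth h are illustrative assumptions, not
    necessarily the surrogate loss used in the paper."""
    s = X @ beta
    # sig[i, j] smooths 1{s_i <= s_j}; thresholds are the scores themselves
    sig = 1.0 / (1.0 + np.exp(-(s[None, :] - s[:, None]) / h))
    f1 = sig[y == 1].mean(axis=0)  # smoothed CDF of positive-class scores
    f0 = sig[y == 0].mean(axis=0)  # smoothed CDF of negative-class scores
    return np.abs(f1 - f0).max()

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (Fan and Li, 2001), a standard nonconcave penalty;
    that SKS uses SCAD specifically is an assumption of this sketch."""
    b = np.abs(np.asarray(beta))
    quad = (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1))
    return np.where(b <= lam, lam * b,
                    np.where(b <= a * lam, quad, lam**2 * (a + 1) / 2)).sum()

# Toy usage: two informative and three noise covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
y = (X @ beta_true + rng.logistic(size=500) > 0).astype(int)
print("empirical KS :", ks_statistic(X @ beta_true, y))
print("penalized obj:", smoothed_ks(beta_true, X, y) - scad_penalty(beta_true, lam=0.1))
```

Maximizing the smoothed objective minus the penalty over beta shrinks the coefficients of uninformative covariates toward zero, which is the variable selection behavior the abstract attributes to SKS.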