Abstract

The Kolmogorov-Smirnov (KS) statistic is a popular performance evaluation criterion for binary classification in many areas because of its clear business interpretation. Fang and Chen (2019) proposed a novel DMKS method that directly maximizes the KS statistic and compares favorably with popular existing methods. However, DMKS does not address the critical problem of variable selection, because the special form of the KS statistic makes it very challenging to establish the asymptotic distribution of the DMKS estimator, which is most likely nonstandard. This intractable issue is handled by introducing a surrogate loss function, which leads to a √n-consistent estimator of the true parameter up to a multiplicative scalar. A nonconcave penalty function is then incorporated to achieve variable selection consistency and asymptotic normality with the oracle property. Results of empirical studies confirm the theoretical results and show the advantages of the proposed SKS (Surrogated Kolmogorov-Smirnov) method over the original DMKS method, which lacks variable selection.
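For readers unfamiliar with the criterion, the following minimal sketch (illustrative only, not code from Fang and Chen) computes the two-sample KS statistic of a classifier's scores: the maximum gap between the empirical score distributions of the positive and negative classes, equivalently the maximum over thresholds of |TPR − FPR|. The simulated data and the linear score x·β are hypothetical stand-ins for a generic binary classification setting.

```python
# Minimal sketch of the KS statistic as a classification criterion:
# KS = max_t |TPR(t) - FPR(t)| over all score thresholds t.
import numpy as np

def ks_statistic(scores, labels):
    """Two-sample KS statistic between the score distributions of the
    positive (label 1) and negative (label 0) classes. Ties in scores
    are ignored for simplicity in this sketch."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    order = np.argsort(scores)[::-1]      # sweep thresholds from high to low
    labels = labels[order]
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    tpr = np.cumsum(labels) / n_pos       # true positive rate at each cutoff
    fpr = np.cumsum(1 - labels) / n_neg   # false positive rate at each cutoff
    return np.max(np.abs(tpr - fpr))

# Hypothetical example: scores from a linear model x @ beta,
# with one irrelevant variable (the kind variable selection should drop).
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 3))
beta = np.array([1.0, -2.0, 0.0])
y = (x @ beta + rng.logistic(size=500) > 0).astype(int)
print(ks_statistic(x @ beta, y))
```

Maximizing this quantity over β is the DMKS objective; since the empirical rates are step functions of β, the criterion is non-smooth, which is consistent with the abstract's remark that the estimator's asymptotic distribution is most likely nonstandard and motivates replacing it with a smooth surrogate loss.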
