Abstract

An efficient transfer learning algorithm is proposed for high-dimensional sparse logistic regression models, based on a penalized weighted score function in the spirit of the square-root Lasso, which allows the tuning parameter to be prespecified. Three choices of the tuning parameter are considered in the case of a fixed design matrix. With a novel weight construction, the estimator of the regression vector is shown to be consistent, provided that the inequality associated with the Karush-Kuhn-Tucker (KKT) optimality conditions holds with high probability and the regression vectors satisfy a sparsity assumption. When the source data carry information useful for fitting the target data, the KKT optimality conditions for the asymptotic choice of the tuning parameter hold with probability tending to 1-α, a bound sharper than the corresponding one obtained without the auxiliary samples. To detect which sources are transferable, an efficient data-driven method is proposed, which helps avoid negative transfer in practice. Simulation studies demonstrate the numerical performance of the proposed procedures and their superiority over some existing methods. The procedures are further illustrated by analyzing binary outcomes in the China Migrants Dynamic Survey dataset, concerning the associations among different provinces.
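To make the general idea concrete, the sketch below illustrates a generic two-step "pool-then-debias" transfer estimator for L1-penalized logistic regression: a rough coefficient estimate is fitted on pooled source and target samples, and a sparse correction is then fitted on the target data alone. This is only a minimal illustration of the transfer-learning strategy, not the authors' penalized weighted score / square-root Lasso estimator or their transferable-source detection procedure; the proximal-gradient solver, the tuning parameters, and all function names are assumptions introduced here for illustration.

```python
import numpy as np


def soft_threshold(z, t):
    """Soft-thresholding operator used in the proximal step for the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def l1_logistic(X, y, lam, offset=None, n_iter=2000):
    """L1-penalized logistic regression via proximal gradient descent (ISTA).

    y takes values in {0, 1}; `offset` is a fixed linear term added to X @ beta,
    which lets the same routine fit a correction on top of an initial estimate.
    """
    n, p = X.shape
    if offset is None:
        offset = np.zeros(n)
    # Step size 1/L, where L = ||X||_2^2 / (4 n) bounds the Lipschitz constant
    # of the gradient of the average logistic loss.
    step = 4.0 * n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = offset + X @ beta
        p_hat = 1.0 / (1.0 + np.exp(-eta))          # predicted probabilities
        grad = X.T @ (p_hat - y) / n                # gradient of the logistic loss
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta


def transfer_logistic(X_src, y_src, X_tgt, y_tgt, lam_pool, lam_debias):
    """Two-step pool-then-debias transfer estimator (hypothetical sketch)."""
    # Step 1: rough estimate from the pooled source and target samples.
    X_pool = np.vstack([X_src, X_tgt])
    y_pool = np.concatenate([y_src, y_tgt])
    w_hat = l1_logistic(X_pool, y_pool, lam_pool)
    # Step 2: sparse correction fitted on the target data only, with the
    # pooled fit entering through a fixed offset.
    delta_hat = l1_logistic(X_tgt, y_tgt, lam_debias, offset=X_tgt @ w_hat)
    return w_hat + delta_hat
```

In this sketch, the correction step controls how far the final estimate may deviate from the pooled fit, which is what protects against negative transfer when a source is only partially informative; the paper's data-driven detection method addresses the same concern by screening out non-transferable sources before pooling.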
