Abstract
Multi-task feature selection (MTFS) has proven effective in mitigating the curse of dimensionality in large-scale classification. Most existing MTFS methods assume that all tasks are learned concurrently in a static environment. In real-world applications, however, new tasks emerge dynamically, so this static assumption falls short. In this paper, we construct a dynamic multi-task feature selection framework that achieves feature reduction for constantly arriving new tasks. First, we relax the traditional mapping from hard labels to soft labels: unlike the conventional rigid mapping, the resulting flexible loss function replaces the direct mapping strategy with an indirect one. Second, we employ an orthogonal regularization term to enforce independence between new and old tasks, which ensures that the features selected for a new task differ from those selected for prior tasks. Finally, we integrate the flexible loss and the orthogonal regularization term into the dynamic multi-task feature selection framework. Our method outperforms nine other state-of-the-art feature selection methods in both effectiveness and efficiency across six datasets; for example, its classification accuracy (ACC) is almost 1% higher than that of the next-best method on the large-scale SUN dataset.
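To make the two ingredients concrete, one plausible objective for an arriving task is sketched below. The notation here is our illustrative assumption, not necessarily the paper's exact formulation:

\min_{W_n,\, M \ge 0} \; \lVert X W_n - (Y + B \odot M) \rVert_F^2 + \lambda \lVert W_n \rVert_{2,1} + \gamma \lVert W_o^{\top} W_n \rVert_F^2,

where X is the data matrix of the new task, Y its binary (hard) label matrix, B a fixed sign matrix (+1 in the true-class entry, -1 elsewhere), M \ge 0 a learned slack matrix that drags the hard targets Y into soft targets Y + B \odot M, W_n the projection to be learned for the new task, W_o the stacked projections of previously learned tasks, and \lambda, \gamma trade-off parameters. Under this reading, the first term is the flexible (indirect) loss; the \ell_{2,1} norm induces row sparsity in W_n so that rows with large norm mark the selected features; and the last term penalizes overlap between W_n and W_o, steering the new task toward features not already claimed by old tasks.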