Multi-task feature selection (MTFS) has proven effective for mitigating the curse of dimensionality in large-scale classification. Most existing MTFS methods assume that all tasks are learned concurrently in a static environment. In practice, however, new tasks emerge dynamically, so this static assumption does not hold. In this paper, we propose a dynamic multi-task feature selection framework that performs feature reduction for continually arriving tasks. First, we replace the traditional hard-label mapping with soft labels: the resulting flexible loss function relaxes the rigid direct mapping into an indirect one. Second, we introduce an orthogonal regularization term that enforces independence between new and old tasks, ensuring that the features selected for a new task differ from those of prior tasks. Finally, we integrate the flexible loss and the orthogonal regularization term into a unified dynamic multi-task feature selection framework. Across six datasets, our method outperforms nine state-of-the-art feature selection methods in both effectiveness and efficiency; for example, its accuracy (ACC) on the large-scale SUN dataset is nearly 1% higher than that of the next-best method.
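To make the two ingredients concrete, the sketch below shows one plausible instantiation of this kind of objective: a soft-label fitting loss, an orthogonality penalty against previously learned task weights, and an l2,1 norm for row-sparse feature selection, solved by proximal gradient descent. All names, the exact loss form, and the hyperparameters are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of a dynamic multi-task feature selection objective.
# Assumptions (not from the paper): X is the (n x d) data matrix for the new
# task, Y_soft holds (n x c) soft labels, W_old stores (d x c_old) weights
# retained from prior tasks, and a proximal gradient loop is used.
import numpy as np

def l21_prox(W, t):
    """Row-wise shrinkage: proximal operator of t * ||W||_{2,1}."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return W * scale

def fit_new_task(X, Y_soft, W_old, alpha=1.0, beta=1.0, lr=1e-3, iters=500):
    """Minimize ||X W - Y_soft||_F^2 + alpha ||W_old^T W||_F^2 + beta ||W||_{2,1}.

    The first term is a soft-label (flexible) fitting loss; the second
    penalizes overlap with previously learned task weights, pushing the
    new task toward complementary features; the third induces row
    sparsity so that surviving rows index the selected features.
    """
    d, c = X.shape[1], Y_soft.shape[1]
    W = np.zeros((d, c))
    P = W_old @ W_old.T  # d x d matrix driving the orthogonality penalty
    for _ in range(iters):
        # Gradient of the two smooth terms, then a prox step for ||W||_{2,1}.
        grad = 2.0 * X.T @ (X @ W - Y_soft) + 2.0 * alpha * (P @ W)
        W = l21_prox(W - lr * grad, lr * beta)
    return W

# Usage: features whose weight rows survive the l2,1 penalty are the
# ones selected for the newly arrived task.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
Y_soft = rng.random((100, 3))          # soft labels in [0, 1]
W_old = rng.standard_normal((50, 4))   # weights from earlier tasks
W_new = fit_new_task(X, Y_soft, W_old)
selected = np.where(np.linalg.norm(W_new, axis=1) > 1e-6)[0]
```

Under these assumptions, the orthogonality penalty `alpha * ||W_old^T W||_F^2` is what keeps the new task from re-selecting features already claimed by prior tasks, mirroring the independence constraint described in the abstract.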