Abstract

Existing deep transfer learning methods only consider adjusting network weights and ignore optimization of the network structure. This study is the first to propose adapting the network structure when performing domain adaptation. Following the idea of neural architecture search, the deep transfer network is encoded into a specified representation, and network structures with higher adaptability are generated through genetic operators (e.g., crossover, mutation). Domain adaptation in transfer learning is modeled as a multi-objective optimization problem, and the loss function used to train the deep transfer network is transformed into the fitness function for evaluating individuals. Experiments show that the proposed framework can not only improve the performance of the model in the target domain, but also reduce forgetting of the source task, which is of great significance for online transfer learning and continual learning.
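The abstract describes an evolutionary search over encoded network structures, with fitness derived from the transfer network's training losses. The paper's own encoding and fitness are not given here, so the following is a minimal sketch of that kind of search loop: the gene choices, layer count, the synthetic stand-in losses, and the weighted-sum combination of target-domain and source-domain objectives are all illustrative assumptions, not the authors' implementation. In the actual framework, each individual would be decoded into a deep transfer network and its fitness obtained from training and evaluating that network.

```python
import random

# Hypothetical architecture encoding: each gene is the width of one hidden layer.
GENE_CHOICES = [32, 64, 128, 256]
NUM_LAYERS = 4


def random_individual():
    """Sample a random architecture encoding (list of layer widths)."""
    return [random.choice(GENE_CHOICES) for _ in range(NUM_LAYERS)]


def crossover(parent_a, parent_b):
    """Single-point crossover between two encodings."""
    point = random.randint(1, NUM_LAYERS - 1)
    return parent_a[:point] + parent_b[point:]


def mutate(individual, rate=0.2):
    """Resample each gene independently with probability `rate`."""
    return [random.choice(GENE_CHOICES) if random.random() < rate else g
            for g in individual]


def fitness(individual):
    """Placeholder multi-objective fitness.

    In the paper's setting this would come from training the decoded
    transfer network and measuring (i) the target-domain loss and
    (ii) the source-domain loss (forgetting). Here synthetic stand-ins
    for both objectives are combined with an assumed weighted sum.
    """
    target_loss = sum(abs(g - 128) for g in individual) / 1000.0  # stand-in
    source_loss = sum(abs(g - 64) for g in individual) / 1000.0   # stand-in
    return -(0.7 * target_loss + 0.3 * source_loss)  # higher is better


def evolve(pop_size=20, generations=10):
    """Evolve a population of encodings via selection, crossover, mutation."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            pa, pb = random.sample(survivors, 2)
            children.append(mutate(crossover(pa, pb)))
        population = survivors + children
    return max(population, key=fitness)


if __name__ == "__main__":
    print("best encoding:", evolve())
```

A weighted sum is only one way to treat the two objectives; the paper frames domain adaptation as a multi-objective problem, so a Pareto-based selection scheme could replace the scalarized fitness used in this sketch.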
