Abstract
Deep transfer learning typically assumes that the feature distributions of the training datasets are similar, which amounts to treating the distance between them as constant. However, this assumption obscures the relation between features and accuracy, and it also weakens the network's ability to prevent overfitting. To obtain better transfer results, we propose a dynamic model of a deep transfer learning network that accounts for the influence of features on the learning task. First, we formally define the distance and dropout-rate functions. Second, we present our model and its algorithm for deep transfer learning networks. Using preprocessed MNIST and CIFAR-10 data, we conduct experiments comparing transfer learning networks with conventional ones. The results show that the proposed model performs better in both accuracy and efficiency.
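The abstract does not give the explicit form of the mapping from feature distance to dropout rate, so the sketch below is only a minimal illustration of the idea: measure how far the target-domain features have drifted from the source-domain features, and set the dropout rate dynamically from that distance instead of keeping it constant. The distance measure, the bounded monotone mapping, and the names and hyperparameters (`feature_distance`, `dynamic_dropout_rate`, `p_min`, `p_max`, `scale`) are all illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F


def feature_distance(source_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
    """Squared Euclidean distance between the mean feature vectors of the two
    domains -- a simple stand-in for whatever distance the paper defines."""
    return (source_feats.mean(dim=0) - target_feats.mean(dim=0)).pow(2).sum()


def dynamic_dropout_rate(distance: torch.Tensor,
                         p_min: float = 0.1,
                         p_max: float = 0.5,
                         scale: float = 1.0) -> float:
    """Map a nonnegative feature distance to a dropout rate in [p_min, p_max).
    The tanh shape and the hyperparameter values are illustrative assumptions:
    a small distance keeps dropout near p_min, a large one pushes it toward p_max."""
    p = p_min + (p_max - p_min) * torch.tanh(scale * distance)
    return float(p)


# Usage sketch: recompute the rate each step from the current feature batches
# and apply functional dropout with that rate.
source_feats = torch.randn(64, 128)   # features from a source (pre-training) batch
target_feats = torch.randn(64, 128)   # features from a target (fine-tuning) batch
p = dynamic_dropout_rate(feature_distance(source_feats, target_feats))
dropped = F.dropout(target_feats, p=p, training=True)
```

In this reading, the dropout rate becomes a function of how dissimilar the two feature distributions currently are, rather than a fixed hyperparameter chosen in advance.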