Abstract

This paper examines how the relationship between two datasets, a base (source) dataset and a target dataset, affects transfer learning in neural networks as the sizes of both are varied. Because labeling data is costly and time-consuming, the set of labeled instances available for a target task is generally small. We propose a series of experiments mapping source and target datasets across several transfer-learning scenarios, each with its own effects: either training the target classes from scratch, or keeping whichever layers are available from a network trained on the Places 205 and ImageNet datasets and on scaled-down versions of them. The results indicate how each scenario is affected by data size for similar versus different datasets and for small versus large numbers of categories, and show that freezing the weights of the first few layers of the network provides a good fit on the target classes.
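The layer-freezing setup the abstract refers to can be sketched as follows. This is a minimal illustration in PyTorch, not the paper's actual code: the tiny `nn.Sequential` model is a hypothetical stand-in for a network pretrained on a source dataset such as ImageNet or Places 205, and `freeze_early_layers` and `n_frozen` are names introduced here for clarity.

```python
import torch.nn as nn


def freeze_early_layers(model: nn.Sequential, n_frozen: int) -> nn.Sequential:
    """Freeze the first n_frozen child modules so they keep the
    source-task features; later modules remain trainable on the target."""
    for i, child in enumerate(model.children()):
        trainable = i >= n_frozen
        for p in child.parameters():
            p.requires_grad = trainable
    return model


# Toy stand-in for a pretrained CNN (for illustration only).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3), nn.ReLU(),      # early layers: generic features
    nn.Conv2d(8, 16, 3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 10),                  # new head for the target classes
)

# Freeze the first four modules (both conv blocks); train only the head.
freeze_early_layers(model, n_frozen=4)
```

Training from scratch on the target classes corresponds to `n_frozen=0`; reusing more of the source network corresponds to larger `n_frozen`.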
