Abstract

Deep learning is known to be data-hungry, which hinders its application in many areas of science where data sets are small. Here, we propose to use transfer learning methods to migrate knowledge between different physical scenarios and significantly improve the prediction accuracy of artificial neural networks trained on a small data set. This method can help reduce the demand for expensive data by making use of additional, inexpensive data. First, we demonstrate that, in predicting the transmission of multilayer photonic films, the relative error rate is reduced by 50.5% (23.7%) when the source data come from 10-layer (8-layer) films and the target data come from 8-layer (10-layer) films. Second, we show that the relative error rate is decreased by 19.7% when knowledge is transferred between two very different physical scenarios: transmission through multilayer films and scattering from multilayer nanoparticles. Next, we propose a multitask learning method that improves performance on several physical scenarios simultaneously, where each task has only a small data set. Finally, we demonstrate that the transfer learning framework truly discovers the common underlying physical rules rather than merely performing a particular form of regularization.
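
The transfer learning workflow described above amounts to pretraining a network on the large, inexpensive source data set and then fine-tuning it on the small, expensive target data set. The sketch below illustrates this idea in PyTorch; the MLP architecture, layer sizes, tensor shapes, and the choice of which layers to freeze are illustrative assumptions rather than the paper's actual configuration, and random tensors stand in for real spectra.

```python
import torch
import torch.nn as nn


def make_mlp(in_dim, out_dim, hidden=256):
    # Fully connected network mapping structure parameters to a spectrum.
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )


def train(model, x, y, epochs=200, lr=1e-3):
    # Optimize only the parameters that are not frozen.
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()


# Placeholder tensors standing in for real data (hypothetical shapes:
# 10 structure parameters in, 200 spectral points out).
x_source, y_source = torch.randn(5000, 10), torch.randn(5000, 200)  # large, cheap source set
x_target, y_target = torch.randn(200, 10), torch.randn(200, 200)    # small, expensive target set

# 1) Pretrain on the source scenario (e.g. 10-layer films).
source_model = make_mlp(10, 200)
train(source_model, x_source, y_source)

# 2) Transfer: copy the weights, freeze the early layers, and fine-tune the
#    remaining layers on the small target set (e.g. 8-layer films, here
#    assumed to share the same padded input dimensionality).
target_model = make_mlp(10, 200)
target_model.load_state_dict(source_model.state_dict())
for layer in list(target_model.children())[:2]:  # first Linear + ReLU
    for p in layer.parameters():
        p.requires_grad = False
train(target_model, x_target, y_target)
```

In this sketch the shared early layers act as the "migrated knowledge," while the fine-tuned later layers adapt to the new physical scenario; the multitask variant mentioned in the abstract would instead train such shared layers jointly with one output head per scenario.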
