Abstract

This paper proposes a new model compression method that self-compresses multi-layered neural networks into single-layered ones. With the help of mutual information control, the compressed single-layered networks can then be interpreted easily and naturally, with improved generalization. In earlier work on interpreting the inference mechanism, we proposed a self-compression method applied directly to multi-layered neural networks to produce compressed or collective weights. However, the interpretation of the compressed networks tends to be unstable and changeable because of the multiple layers, so the compressed networks and their weights need to be simplified and stabilized. In this context, the present paper does not compress multi-layered networks directly and immediately; instead, the multi-layered networks are first compressed into single-layered networks, the single-layered networks are then simplified as much as possible by mutual information control, and finally they are compressed into non-layered networks. The method was applied to two well-studied data sets: the teaching assistant evaluation data set and the occupancy status inference data set. The results confirmed two important points. First, generalization performance could be improved by transferring information from multi-layered to single-layered networks. Second, the compressed or collective connection weights were easily and naturally interpreted.
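The sketch below is not the authors' code; it only illustrates, under assumptions, the general idea of collapsing layered weights into a single "collective" input-to-output weight matrix and then checking how simple (concentrated) that matrix is. The layer sizes, the random weights, and the entropy-based simplicity measure standing in for mutual information control are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multi-layered network: 5 inputs -> 8 hidden -> 6 hidden -> 3 outputs.
W1 = rng.normal(size=(5, 8))
W2 = rng.normal(size=(8, 6))
W3 = rng.normal(size=(6, 3))

# Collapse the layered weights into one non-layered (collective) matrix:
# each entry summarizes the overall strength from an input to an output.
collective = W1 @ W2 @ W3          # shape (5, 3)

# Crude stand-in for mutual-information-based simplification: normalize the
# absolute collective weights per output unit and measure their entropy;
# lower entropy means the weights are concentrated on few inputs and are
# therefore easier to interpret.
p = np.abs(collective) / np.abs(collective).sum(axis=0, keepdims=True)
entropy = -(p * np.log(p + 1e-12)).sum(axis=0)

print("Collective input-to-output weights:\n", np.round(collective, 2))
print("Per-output entropy (lower = simpler):", np.round(entropy, 3))
```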
