Abstract

The Densely Connected Network (DenseNet) is widely recognized as a highly competitive deep neural network architecture. Its most distinctive property is its Dense Connections, which form each layer's input by concatenating the outputs of all preceding layers and thereby improve performance by pushing feature reuse to the extreme. However, these same Dense Connections cause the input dimension to grow with depth, making DenseNet resource-intensive and inefficient. In light of this, and inspired by the Residual Network (ResNet), we propose an improved DenseNet named Additive DenseNet, which replaces the concatenation operations used in Dense Connections with the addition operations used in ResNet; to preserve feature reuse, it extends addition to accumulation (namely ∑(·)), so that each layer's input is the summation of all preceding layers' outputs. Consequently, Additive DenseNet not only keeps the input dimension from growing but also retains the effect of Dense Connections. In this paper, Additive DenseNet is applied to text classification. The experimental results show that, compared to DenseNet, Additive DenseNet reduces model complexity by a large margin in terms of GPU memory usage and parameter count. Despite this resource economy, Additive DenseNet still outperforms DenseNet in accuracy on six text classification datasets and shows competitive performance in model training.
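The connection scheme described above can be illustrated with a minimal sketch (not taken from the paper; a PyTorch-style toy example with illustrative layer names and sizes). It shows how summing the preceding layers' outputs keeps the feature dimension fixed, whereas DenseNet's concatenation would enlarge it at every layer:

```python
import torch
import torch.nn as nn

class AdditiveDenseBlock(nn.Module):
    """Sketch of a block in which each layer's input is the summation of
    all preceding layers' outputs (instead of their concatenation)."""

    def __init__(self, dim: int, num_layers: int):
        super().__init__()
        # Because inputs are summed rather than concatenated, every layer
        # keeps the same feature dimension `dim`.
        self.layers = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(num_layers)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outputs = [x]  # outputs of the block input and all preceding layers
        for layer in self.layers:
            # Each layer receives sum_i h_i over all preceding outputs.
            layer_input = torch.stack(outputs, dim=0).sum(dim=0)
            outputs.append(layer(layer_input))
        return outputs[-1]

# Usage: the feature dimension stays at 128 regardless of depth, unlike
# concatenation, which would grow it by `dim` with every added layer.
block = AdditiveDenseBlock(dim=128, num_layers=4)
h = block(torch.randn(32, 128))
print(h.shape)  # torch.Size([32, 128])
```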
