Abstract
This paper proposes new modified methods for back-propagation neural networks and uses a semantic feature space to improve categorization performance and efficiency. The standard back-propagation neural network (BPNN) suffers from slow learning and a tendency to become trapped in local minima, which leads to poor performance and efficiency. In this paper, we propose two methods to modify the standard BPNN and adopt the semantic feature space (SFS) method to reduce the number of dimensions and to construct latent semantic relations between terms. The experimental results show that the modified methods outperformed the standard BPNN and were also more efficient. The SFS method not only greatly reduces dimensionality but also enhances performance, and it can therefore be used to further improve the precision and efficiency of text categorization systems.
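The abstract does not spell out the authors' two modifications or the exact SFS construction, so the following is only a minimal sketch of the general setup it describes: a term-document matrix is first projected into a lower-dimensional latent space (here approximated with a truncated SVD, in the spirit of latent semantic indexing), and the reduced vectors are then classified by a standard one-hidden-layer BPNN trained with plain gradient descent. All data, sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact methods): truncated-SVD dimensionality
# reduction as a stand-in for the SFS step, followed by a standard BPNN.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical term-document matrix: 200 documents x 1000 terms, 4 categories.
X = rng.random((200, 1000))
labels = rng.integers(0, 4, size=200)
Y = np.eye(4)[labels]                      # one-hot category targets

# "SFS-like" step: project documents onto the top-k right singular directions,
# reducing 1000 term dimensions to k latent dimensions.
k = 50
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_reduced = X @ Vt[:k].T                   # 200 x 50

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained with plain gradient-descent back propagation.
W1 = rng.normal(scale=0.1, size=(k, 32))
W2 = rng.normal(scale=0.1, size=(32, 4))
lr = 0.5

for epoch in range(200):
    # forward pass
    H = sigmoid(X_reduced @ W1)
    O = sigmoid(H @ W2)
    # backward pass (squared-error loss, sigmoid derivatives)
    dO = (O - Y) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dO / len(X_reduced)
    W1 -= lr * X_reduced.T @ dH / len(X_reduced)

train_accuracy = (O.argmax(axis=1) == labels).mean()
```

The paper's contribution lies in modifying this baseline training procedure (e.g., to speed up convergence and avoid local minima) and in how the semantic feature space is built; the sketch above only fixes the baseline against which those improvements are measured.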