Abstract

In recurrent language models, exploiting a class hierarchy over the vocabulary is a major direction for overcoming the over-large-vocabulary problem, yet the hierarchy is typically not aligned across the model's components: the embedding, hidden, and softmax layers. Most current methods employ hierarchical information only in the embedding and/or softmax layers, which raises the question of whether incorporating such information into the hidden layer would also benefit overall language modeling performance. We therefore propose a dual channel class hierarchy (DCCH) model that uses two channels of RNNs to form a class hierarchy within the model, where the class channel captures information from the class sequence. Furthermore, we study two auxiliary techniques for class organization, word hierarchy initialization and class exchange, to boost overall performance. Finally, experiments on the PTB, WikiText-103, Wiki-fr and OBW datasets evaluate the potential of the proposed model and support our observations.
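The abstract does not spell out how the two channels are wired, so the following PyTorch-style sketch is only one plausible reading, not the paper's verified architecture. It assumes a fixed word-to-class mapping, one LSTM per channel, and a class-factorized output softmax; all names (`DCCHSketch`, `factorized_loss`, `word2class`) and dimensions are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DCCHSketch(nn.Module):
    """Two-channel RNN LM: a word channel over word embeddings and a
    class channel over class embeddings, combined via a class-factorized
    softmax P(w) = P(c | h_cls) * P(w | c, h_word). (Assumed wiring.)"""

    def __init__(self, n_words, n_classes, word2class, d=256):
        super().__init__()
        self.register_buffer("word2class", word2class)    # (n_words,) class id per word
        self.word_emb = nn.Embedding(n_words, d)
        self.class_emb = nn.Embedding(n_classes, d)
        self.word_rnn = nn.LSTM(d, d, batch_first=True)   # word channel
        self.class_rnn = nn.LSTM(d, d, batch_first=True)  # class channel
        self.class_out = nn.Linear(d, n_classes)          # class-level softmax
        self.word_out = nn.Linear(d, n_words)             # within-class word scores

    def forward(self, words):
        # words: (batch, time) token ids; the class sequence is derived
        # from the (assumed fixed) word-to-class mapping.
        classes = self.word2class[words]
        h_word, _ = self.word_rnn(self.word_emb(words))
        h_cls, _ = self.class_rnn(self.class_emb(classes))
        return self.class_out(h_cls), self.word_out(h_word)

def factorized_loss(class_logits, word_logits, targets, word2class):
    """Cross-entropy over classes plus cross-entropy over words restricted
    to the target class (words outside that class are masked to -inf)."""
    tgt_cls = word2class[targets].flatten()                 # (B*T,)
    cls_loss = F.cross_entropy(class_logits.flatten(0, 1), tgt_cls)
    mask = word2class.unsqueeze(0) != tgt_cls.unsqueeze(1)  # (B*T, V)
    wl = word_logits.flatten(0, 1).masked_fill(mask, float("-inf"))
    word_loss = F.cross_entropy(wl, targets.flatten())
    return cls_loss + word_loss
```

In actual training one would shift the targets by one step (predicting token t+1 from the prefix up to t) and, per the abstract's auxiliary techniques, initialize or periodically reassign `word2class` (word hierarchy initialization and class exchange); both are elided here.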
