Abstract

Sum-product networks (SPNs) are an expressive deep probabilistic architecture with solid theoretical foundations that allows tractable and exact inference. SPNs often act as black-box inference machines in many artificial intelligence tasks. Due to their recursive definition, SPNs can also be naturally employed as hierarchical feature extractors. Recently, SPNs have been successfully employed as an autoencoder framework in representation learning. However, the SPN autoencoder ignores the structural duality between its models and trains them separately and independently. In this work, we propose a Dual-SPN autoencoder, which composes two SPN autoencoders in a dual form. This approach trains the models simultaneously and explicitly exploits the structural duality between them to enhance the training process. Experimental results on several multilabel classification problems demonstrate that the Dual-SPN autoencoder is highly competitive with state-of-the-art autoencoder architectures.
