Abstract
• We connect the attention mechanism and the mixture of Gaussian processes.
• We design two novel mixture-of-Gaussian-processes models based on the attention mechanism.
• The proposed models are applicable when the input variable lies on a manifold or a graph.

The mixture of Gaussian processes (MGP) is a powerful model that can characterize data generated by a general stochastic process. However, conventional MGPs assume the input variable obeys a certain probability distribution, and therefore cannot effectively handle the case where the input variable lies on a general manifold or a graph. In this paper, we first clarify the relationship between the MGP prediction strategy and the attention mechanism. Building on the attention mechanism, we then design two novel mixture models of Gaussian processes that do not rely on probabilistic assumptions about the input domain, thus overcoming the difficulty of extending MGP models to manifolds or graphs. Experimental results on real-world datasets demonstrate the effectiveness of the proposed methods.
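A minimal sketch of the analogy the abstract alludes to, in our own illustrative notation (the gates \(\pi_k\), expert means \(m_k\), and attention symbols \(q, k_i, v_i\) are assumptions, not taken from the paper): an MGP with \(K\) experts predicts at a test input \(x_*\) via a gate-weighted combination of the expert predictive means,

\[
\hat{y}(x_*) = \sum_{k=1}^{K} \pi_k(x_*)\, m_k(x_*), \qquad \pi_k(x_*) \ge 0,\ \ \sum_{k=1}^{K} \pi_k(x_*) = 1,
\]

while attention computes a softmax-weighted combination of values,

\[
\mathrm{Attn}(q) = \sum_{i} \frac{\exp(q^\top k_i)}{\sum_{j} \exp(q^\top k_j)}\, v_i.
\]

Both are convex combinations whose weights depend only on how the test input (query) relates to the components (keys). Replacing density-based gates with attention-style scores is, plausibly, what allows the mixture weights to be defined without assuming a probability distribution on the input domain.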