Abstract
Learning embeddings in hyperbolic space has gained increasing interest in the community, owing to its negative curvature, which makes it well suited to encoding hierarchical data. Recent works have investigated improving the representation power of hyperbolic embeddings through kernelization. However, existing developments focus on defining positive definite (pd) kernels, which may weaken the intriguing properties of hyperbolic spaces, because the structures of hyperbolic spaces are naturally modeled in indefinite spaces (e.g., Kreĭn spaces). This paper addresses this issue by developing adaptive indefinite kernels, which can better utilize the structures in the Kreĭn space. To this end, we first propose an adaptive embedding function in the Lorentz model and define indefinite Lorentz kernels (iLks) via this embedding function. Owing to the isometric relationship between the Lorentz model and the Poincaré ball, these iLks are further extended to the Poincaré ball, resulting in what we term indefinite Poincaré kernels (iPKs). We evaluate the proposed indefinite kernels on a range of learning scenarios, including image classification, few-shot learning, zero-shot learning, person re-identification, and knowledge distillation. We show that the proposed indefinite kernels bring significant performance gains over the baselines and enjoy better representation power from reproducing kernel Kreĭn spaces (RKKSs) than pd kernels.
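The abstract does not give the paper's actual iLk/iPK definitions, so the sketch below only illustrates two standard facts it relies on: the well-known isometry from the Lorentz model to the Poincaré ball (curvature -1 assumed), and the canonical form of a Kreĭn-space (indefinite) kernel as a difference of two pd kernels. The function names, the RBF choice, and the bandwidth parameters are illustrative assumptions, not the paper's method.

```python
# Minimal sketch, NOT the paper's iLk/iPK construction.
import numpy as np

def lift_to_lorentz(v):
    """Lift a Euclidean vector v onto the hyperboloid <x, x>_L = -1
    by setting x0 = sqrt(1 + ||v||^2)."""
    return np.concatenate(([np.sqrt(1.0 + np.dot(v, v))], v))

def lorentz_to_poincare(x):
    """Standard isometry: map x = (x0, x1, ..., xn) on the hyperboloid
    to the Poincaré ball via p_i = x_i / (1 + x0)."""
    return x[1:] / (1.0 + x[0])

def rbf(a, b, gamma=1.0):
    """Gaussian (pd) kernel."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def indefinite_kernel(a, b, gamma_plus=1.0, gamma_minus=0.2):
    """Generic indefinite kernel k = k_+ - k_-: any reproducing kernel of a
    Kreĭn space (RKKS) decomposes as a difference of two pd kernels."""
    return rbf(a, b, gamma_plus) - rbf(a, b, gamma_minus)

# Example: lift two Euclidean points to the Lorentz model, map them to the
# Poincaré ball via the isometry, and evaluate the indefinite kernel.
u, v = np.array([0.3, -0.1]), np.array([0.5, 0.4])
pu = lorentz_to_poincare(lift_to_lorentz(u))
pv = lorentz_to_poincare(lift_to_lorentz(v))
print(indefinite_kernel(pu, pv))
```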