Learning embeddings in hyperbolic space has attracted increasing interest in the community, since its negative curvature makes it well suited to encoding hierarchical data. Recent works seek to improve the representation power of hyperbolic embeddings through kernelization. However, existing developments focus on defining positive definite (pd) kernels, which may compromise the intriguing properties of hyperbolic spaces, because the structures of hyperbolic spaces are naturally modeled in indefinite spaces (e.g., the Kreĭn space). This paper addresses this issue by developing adaptive indefinite kernels that better exploit the structures of the Kreĭn space. To this end, we first propose an adaptive embedding function in the Lorentz model and use it to define indefinite Lorentz kernels (iLks). Owing to the isometry between the Lorentz model and the Poincaré ball, the iLks are further extended to the Poincaré ball, yielding what we term indefinite Poincaré kernels (iPKs). We evaluate the proposed indefinite kernels on a variety of learning scenarios, including image classification, few-shot learning, zero-shot learning, person re-identification, and knowledge distillation. We show that the proposed indefinite kernels bring significant performance gains over the baselines and enjoy stronger representation power from reproducing kernel Kreĭn spaces (RKKSs) than pd kernels do.