Hyperbolic space has garnered attention for its unique properties and efficient representation of hierarchical structures. Recent studies have explored hyperbolic alternatives to hyperplane-based classifiers, such as logistic regression and support vector machines. Hyperbolic methods have even been fused into random forests by constructing data splits with horospheres, which proved effective on hyperbolic datasets. However, the existing incorporation of horospheres incurs substantial computation time, limiting its applicability to most datasets. Against this backdrop, we introduce an extension of XGBoost, a renowned machine learning (ML) algorithm, to hyperbolic space, denoted PXgboost. This extension redefines the node-split criterion using the Riemannian gradient and Riemannian Hessian. Comprehensive experiments on 64 datasets from the UCI ML repository and 8 datasets from WordNet, using both their Euclidean and hyperbolic-transformed (hyperbolic UCI) representations, demonstrate the promising performance of PXgboost relative to algorithms from the literature. Furthermore, our findings suggest that Euclidean metric-based classifiers perform well even on hyperbolic data. Building on this finding, we propose a space-fusion classifier, EPboost, which harmonizes data processing across the two spaces and integrates their probability outputs for prediction. In a comparative analysis against 19 algorithms on the UCI datasets, EPboost outperforms the others in most cases, underscoring its efficacy and potential significance in diverse ML applications. This research marks a step forward in harnessing hyperbolic geometry for ML tasks and showcases its potential to enhance algorithmic efficacy.