This paper presents a comparative study of three models, the virtual point detector (VPD), curve fitting (CF), and machine learning (ML) models, for determining the full-energy peak efficiency (FEPE) of a high-purity germanium (HPGe) detector for point sources of various energies located on the symmetry axis of the detector at different source-to-detector distances. First, the FEPEs were determined for energies from 40 keV to 2500 keV and distances from 1.88 cm to 25.68 cm using the MCNP6 code. These simulated data were then used to construct the VPD, CF, and ML models. The accuracy of each model was assessed by comparing its efficiency calibration results against experimental data. For the VPD and CF models, the results are fairly accurate at energies above 240 keV, with relative deviations between calculated and experimental FEPEs (RD_Exp/Cal) of less than 4 %. However, for energies below 240 keV and at small source-to-detector distances, the RD_Exp/Cal is considerably larger, reaching 10 % with the VPD model and 5 % with the CF model. For the ML model, the maximum deviation between predicted and experimental FEPEs is 2.4 % over the entire investigated range of energies and distances. Our study shows that the ML model outperforms the VPD and CF models in terms of accuracy, particularly at low energies (below 240 keV) and at positions close to the detector surface, where the VPD and CF models often yield results of low accuracy. Therefore, the ML model can be considered a useful and highly reliable computational approach for calculating the FEPE of an HPGe detector.
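
To make the workflow concrete, the sketch below illustrates the general idea of an ML efficiency model: a regressor trained on simulated (energy, distance) → FEPE points, and a relative-deviation metric for comparison with experiment. This is only a minimal illustration under stated assumptions; the paper does not specify its ML architecture, the scikit-learn regressor is a stand-in, the numerical values are placeholders rather than real simulated data, and RD_Exp/Cal is assumed to follow the usual |exp − cal| / exp definition.

```python
# Illustrative sketch only: a generic regressor stands in for the paper's ML model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical training data: (energy [keV], source-to-detector distance [cm])
# with MCNP6-style simulated FEPE values (placeholder numbers, not real data).
X_sim = np.array([[59.5, 1.88], [661.7, 9.88], [1332.5, 25.68]])
fepe_sim = np.array([1.2e-1, 2.5e-2, 3.0e-3])
y_sim = np.log(fepe_sim)  # fitting log(FEPE) assumes a smoother target surface

model = GradientBoostingRegressor().fit(X_sim, y_sim)

def predict_fepe(energy_kev: float, distance_cm: float) -> float:
    """Predict FEPE for a point source on the detector axis (illustrative)."""
    return float(np.exp(model.predict([[energy_kev, distance_cm]])[0]))

def relative_deviation(fepe_exp: float, fepe_cal: float) -> float:
    """RD_Exp/Cal in %, assuming the definition |exp - cal| / exp * 100."""
    return abs(fepe_exp - fepe_cal) / fepe_exp * 100.0
```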