Objective. Precise hip joint morphometric measurement from CT images is crucial for successful preoperative arthroplasty planning and biomechanical simulations. Although deep learning approaches have been applied to clinical bone surgery planning, research on automatically quantifying hip joint morphometric parameters from CT images remains limited. Approach. This paper proposes a deep learning workflow for CT-based hip morphometry measurement. In the first step, a coarse-to-fine deep learning model is designed to accurately reconstruct the hip geometry (3D bone models and key landmark points). Based on these geometric models, a robust measurement method is developed to calculate a full set of morphometric parameters, including the acetabular anteversion and inclination and the femoral neck-shaft angle and inclination, among others. Our methods were validated on two datasets acquired with different imaging protocol parameters and further compared with the conventional 2D x-ray-based measurement method. Main results. The proposed method achieves high bone segmentation accuracy on both datasets (Dice coefficients of 98.18% and 97.85%, respectively) and low landmark prediction errors (1.55 mm and 1.65 mm). The automated measurements agree well with the radiologists' manual measurements (Pearson correlation coefficients between 0.47 and 0.99 and intraclass correlation coefficients between 0.46 and 0.98). The method provides more accurate measurements than the conventional 2D x-ray-based method, reducing the acetabular cup size error from over 2 mm to less than 1 mm. Moreover, our morphometry measurement method is robust against errors in the upstream bone segmentation step: when different deep learning methods were tested for the prerequisite bone segmentation, the final measurements remained consistent, with a maximum inter-method difference in cup size of only 0.37 mm. Significance. This study proposes a deep learning approach with improved robustness and accuracy for hip arthroplasty planning.
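To make the measurement step more concrete, the sketch below illustrates one way acetabular angles could be derived from reconstructed landmark points: fitting a plane to acetabular rim landmarks and measuring its axis against anatomical reference directions. This is a minimal illustration under assumed conventions (a patient frame with x = medio-lateral, y = antero-posterior, z = cranio-caudal, and Murray-style radiographic angle definitions), not the paper's exact formulation; the function names are hypothetical.

```python
# Illustrative sketch (not the paper's method): acetabular inclination and
# anteversion from predicted acetabular rim landmarks.
# Assumed patient frame: x = medio-lateral, y = antero-posterior, z = cranio-caudal;
# the coronal plane is spanned by x and z.
import numpy as np

def fit_plane_normal(points: np.ndarray) -> np.ndarray:
    """Least-squares plane normal of an (N, 3) point cloud via SVD."""
    centered = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[-1] / np.linalg.norm(vt[-1])

def acetabular_angles(rim_points: np.ndarray) -> tuple[float, float]:
    """Radiographic inclination and anteversion in degrees (assumed conventions)."""
    axis = fit_plane_normal(rim_points)           # acetabular axis (normal of rim plane)
    if axis[1] < 0:                               # orient the axis to point anteriorly
        axis = -axis
    longitudinal = np.array([0.0, 0.0, 1.0])      # cranio-caudal axis
    coronal_normal = np.array([0.0, 1.0, 0.0])    # antero-posterior axis
    # Anteversion: angle between the acetabular axis and the coronal plane.
    anteversion = np.degrees(np.arcsin(np.clip(abs(axis @ coronal_normal), 0.0, 1.0)))
    # Inclination: angle between the longitudinal axis and the acetabular axis
    # projected onto the coronal plane.
    proj = axis - (axis @ coronal_normal) * coronal_normal
    proj /= np.linalg.norm(proj)
    inclination = np.degrees(np.arccos(np.clip(abs(proj @ longitudinal), 0.0, 1.0)))
    return inclination, anteversion
```

In practice, the rim landmarks would come from the landmark-prediction branch of the reconstruction model, and analogous plane- or axis-fitting steps could yield the femoral parameters; the actual measurement procedure in the paper may differ in both conventions and fitting strategy.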