Height is an important health parameter employed across domains, including healthcare, aesthetics, and athletics. Numerous non-contact methods for height measurement exist; however, most are limited to assessing height in an upright posture. This study presents a non-contact approach for measuring human height in 2D space across different postures. The proposed method utilizes computer vision techniques, specifically the MediaPipe library and the YOLOv8 model, to analyze images captured with a smartphone camera. MediaPipe identifies and marks joint points on the human body, while the YOLOv8 model assists in localizing these points. To determine the actual height of an individual, a multivariate linear regression model was trained using the ratios of distances between the identified joint points. Data from 166 subjects across four distinct postures (standing upright, rotated 45 degrees, rotated 90 degrees, and kneeling) were used to train and validate the model. Results indicate that the proposed method yields height measurements with an error margin of approximately 1.2%. Future research will extend this approach to accommodate additional postures, such as lying down, cross-legged, and bent-legged. Furthermore, the method will be improved to account for various capture distances and angles, thereby enhancing the flexibility and accuracy of height measurement in diverse contexts.
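The sketch below illustrates the kind of pipeline the abstract describes: detect the subject with YOLOv8, extract joint landmarks with MediaPipe Pose, form ratios of inter-joint pixel distances, and regress height with a multivariate linear model. It is a minimal illustration under assumptions, not the authors' implementation; the chosen landmark pairs, the feature construction, and names such as `image_paths`, `heights_cm`, and `subject.jpg` are hypothetical.

```python
# Minimal sketch (assumptions noted above): YOLOv8 person detection,
# MediaPipe Pose landmarks, distance-ratio features, and a multivariate
# linear regression mapping those features to height.
import cv2
import numpy as np
import mediapipe as mp
from ultralytics import YOLO
from sklearn.linear_model import LinearRegression

mp_pose = mp.solutions.pose
detector = YOLO("yolov8n.pt")  # pretrained detector; class 0 is "person"

# Illustrative joint pairs whose pixel distances form the feature ratios
PAIRS = [
    (mp_pose.PoseLandmark.LEFT_SHOULDER, mp_pose.PoseLandmark.LEFT_HIP),
    (mp_pose.PoseLandmark.LEFT_HIP, mp_pose.PoseLandmark.LEFT_KNEE),
    (mp_pose.PoseLandmark.LEFT_KNEE, mp_pose.PoseLandmark.LEFT_ANKLE),
    (mp_pose.PoseLandmark.NOSE, mp_pose.PoseLandmark.LEFT_SHOULDER),
]

def distance_ratio_features(image_path: str) -> np.ndarray:
    """Crop the subject with YOLOv8, locate joints with MediaPipe Pose,
    and return ratios of inter-joint pixel distances."""
    image = cv2.imread(image_path)
    # Keep the top-ranked person box to localize the subject
    det = detector(image, classes=[0], verbose=False)[0]
    x1, y1, x2, y2 = det.boxes.xyxy[0].int().tolist()
    crop = image[y1:y2, x1:x2]

    with mp_pose.Pose(static_image_mode=True) as pose:
        result = pose.process(cv2.cvtColor(crop, cv2.COLOR_BGR2RGB))
    lm = result.pose_landmarks.landmark
    h, w = crop.shape[:2]

    def dist(a, b):
        pa, pb = lm[a.value], lm[b.value]
        return np.hypot((pa.x - pb.x) * w, (pa.y - pb.y) * h)

    d = np.array([dist(a, b) for a, b in PAIRS])
    return d[1:] / d[0]  # ratios relative to the first segment

# Training and prediction (labelled images and heights in cm assumed):
# X = np.vstack([distance_ratio_features(p) for p in image_paths])
# reg = LinearRegression().fit(X, heights_cm)
# predicted = reg.predict(distance_ratio_features("subject.jpg").reshape(1, -1))
```

Using ratios rather than raw pixel distances is one way to reduce sensitivity to camera distance, which is consistent with the posture-robust, single-camera setting the abstract describes.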