One application of artificial intelligence (AI) technology is self-guided physical activity, where a computing device acts as a trainer. A key challenge for such applications is measuring the performance of the AI trainer, especially when it runs on a generic PC or a mobile device. In the spirit of the Turing test, an AI trainer should mimic the behavior of a human trainer. A good human trainer generally considers the training history and skill level of the trainee when providing feedback, which requires more than body-position analysis. In this work, we built a martial arts trainer application called AIShifu that helps users practice martial arts poses using Human Pose Estimation (HPE). We chose an open-source neural network called HRNet, trained on the MS-COCO dataset, as the core of the HPE. The joint coordinates and angles were used to identify the pose being practiced by the trainee, determine whether the active side is left or right, and measure how close the key joint angle is to that from a "golden" reference image. We collected data from a black-belt martial artist and a novice trainee performing three Karate poses. Based on the data, it is clear that the black belt performed the poses more consistently. A much larger sample size would be required to test how well an AI trainer can discern the difference between trainees of different proficiency levels. This understanding forms the foundation for customizing AI trainer software for different users.
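The joint-angle comparison described above can be sketched as follows. This is a minimal illustration, assuming 2D keypoint coordinates of the kind HRNet produces; the function names, the linear tolerance-based score, and the tolerance value are illustrative assumptions, not the paper's actual scoring method.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by keypoints a-b-c.

    Each keypoint is an (x, y) pair, e.g. from an HPE model's output.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_t = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_t))

def score_against_golden(trainee_angle, golden_angle, tolerance=15.0):
    """Linear score in [0, 1]: 1.0 when the angles match exactly,
    falling to 0.0 once the difference reaches the tolerance (degrees).
    The tolerance of 15 degrees is an assumed, illustrative value.
    """
    diff = abs(trainee_angle - golden_angle)
    return max(0.0, 1.0 - diff / tolerance)

# Example: knee angle from hip, knee, and ankle keypoints (pixel coordinates).
hip, knee, ankle = (100, 200), (110, 300), (105, 400)
trainee_knee = joint_angle(hip, knee, ankle)
score = score_against_golden(trainee_knee, golden_angle=175.0)
```

Per-joint scores like this can then be aggregated across the key joints of a pose to give the trainee feedback on which limb deviates most from the reference.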