Infrared (IR) thermography (IRT) is a non-destructive testing (NDT) technique widely used for void and defect detection in materials such as fiber-reinforced composites. Implementing IRT requires solving the inverse heat transfer problem, i.e., calculating the defect attributes (e.g., shape, size, location) from measured temporal and spatial variations of the surface temperature. Unlike the corresponding forward problem, this inverse problem is ill-posed, and the uniqueness of its solution is not guaranteed. To tackle this challenge, the k-nearest neighbors (k-NN) machine learning (ML) algorithm is employed to build a model for predicting the size, thickness, and location of a penny-shaped defect in composite laminates. The study is based on synthetic data produced by ABAQUS finite element analysis (FEA) of the heat transfer model in defective composites, which is used to train the ML algorithm. Surface temperature-versus-time and temperature-versus-distance curves are extracted from the FEA results and, guided by the physics of the problem, are used to derive the training features. The model is trained on data sets from 502 FEA runs, with first 10 features, and subsequently a reduced set of 4 features, selected from these curves to predict the defect attributes.
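To illustrate the described workflow, the sketch below shows a multi-output k-NN regression of the kind the abstract outlines, implemented with scikit-learn. This is not the authors' code: the array shapes, the random placeholder data standing in for the FEA-derived features and defect labels, and the hyperparameters (e.g., n_neighbors=5) are all illustrative assumptions.

```python
# Minimal sketch, assuming features extracted from FEA surface-temperature
# curves and multi-output defect targets; all data here is random placeholder.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for the 502 FEA runs: each row holds the 4 physics-based features
# taken from the temperature-vs-time / temperature-vs-distance curves.
X = rng.random((502, 4))
# Targets: defect size, thickness, and location (hypothetical units).
y = rng.random((502, 3))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# KNeighborsRegressor supports multi-output targets natively; feature scaling
# matters because k-NN distances are sensitive to feature magnitudes.
model = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5))
model.fit(X_train, y_train)

print("R^2 on held-out runs:", model.score(X_test, y_test))
```

In practice, the placeholder arrays would be replaced with the actual feature matrix extracted from the FEA curves and the corresponding defect attributes; standardizing the features before the distance computation is a common design choice for k-NN rather than a requirement stated in the source.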