This article introduces a non-intrusive method for estimating facial muscle activity from images, diverging from conventional electrode-based approaches. Our methodology draws on an inclusive feature set covering a diverse range of facial muscles that are often overlooked in research, thereby significantly expanding the scope of muscle-activity analysis in facial expressions. The method builds on the standard 68-point facial landmark model and extends it by identifying the interactions of these muscles when a person performs a specific facial expression. These interactions are encoded in a feature vector, which is used by three classifiers, Linear Discriminant (LD), Support Vector Machine (SVM), and Multi-Layer Perceptron (MLP), to classify six facial expressions (anger, disgust, fear, happiness, neutrality, and sadness). The method is validated on three databases: FACES, KDEF, and JAFFE. These databases pose different challenges: FACES contains faces of individuals of both genders across three age ranges; KDEF contains images of both genders but only of young adults; and JAFFE comprises solely female faces. JAFFE presents a further limitation, containing only about 10% as many images as FACES. In all these cases, our method yields excellent results, occasionally achieving 100% classification accuracy, especially in the young category. It is also noteworthy that the best results were obtained when classifying facial expressions on female faces, which may suggest that women tend to be more expressive. The method was thoroughly evaluated, and the results demonstrate the robustness of the approach, showing good performance across the three databases and across faces of individuals in various age groups. Numerical metrics, including accuracy, precision, recall, and F1 score, are reported along with confusion matrices.
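As a minimal illustrative sketch (not the authors' implementation), a landmark-based feature vector of the kind described above can be built from geometric relations between 68-point landmark positions. The landmark-pair indices, coordinates, and the distance-based feature choice below are assumptions invented for illustration only:

```python
import math

# Hypothetical pairs of 68-point landmark indices, assumed here to span
# muscle regions of interest (brows, mouth corners, lips, eye corners).
# These index choices are illustrative, not taken from the paper.
MUSCLE_PAIRS = [(21, 22), (48, 54), (51, 57), (36, 45), (17, 26)]

def feature_vector(landmarks):
    """Euclidean distances between the chosen landmark pairs.

    landmarks: mapping from landmark index (0-67) to an (x, y) tuple.
    Returns one distance per pair, forming a simple feature vector.
    """
    return [math.dist(landmarks[i], landmarks[j]) for i, j in MUSCLE_PAIRS]

# Toy synthetic landmark sets: a "neutral" face on a line, and a "happy"
# variant with one mouth-corner landmark displaced, as in a smile.
neutral = {i: (float(i), 0.0) for i in range(68)}
happy = dict(neutral)
happy[54] = (60.0, 2.0)  # displaced mouth corner (synthetic)

fn = feature_vector(neutral)
fh = feature_vector(happy)
```

A classifier such as LD, SVM, or MLP would then be trained on vectors like `fn` and `fh`, one per labeled expression image; the sketch stops at feature extraction since the paper's exact interaction features are not reproduced here.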