ABSTRACT

Machine vision has had a substantial impact on human‐computer interaction and psychological research. Advances in artificial intelligence, neural networks, and deep learning have enhanced the ability of robots to generate realistic facial expressions, fostering deeper emotive connections between humans and machines. This paper offers an in‐depth look at the most recent developments in emotion classification technologies, focusing on the period from 2019 to 2023. It addresses the integration of facial and body gesture analysis into emotion detection, emphasising methods such as partial transfer learning, sign‐based measurement systems, and lightweight convolutional neural networks. By analysing the most recent research, the paper not only provides a comprehensive overview of state‐of‐the‐art methods but also highlights their advantages over previous approaches in efficiency and applicability. The integration of gestures into emotion analysis is underscored as a critical area for further investigation, offering novel opportunities for comprehending and interpreting the intricate layers of nonverbal communication. This survey aims to serve as a resource for researchers and practitioners interested in advancing the field of emotion classification through innovative methodologies.