Human emotion recognition using machine learning is an emerging field with the potential to improve user experience, aid crime prevention, and support targeted advertising, with applications ranging from security cameras to interactive systems. Machine learning-based emotion detection recognises and deciphers human emotions from text and visual data. In this study, we use convolutional neural networks and natural language processing approaches to create and assess models for emotion detection.

Facial expressions visually communicate a great deal of information without the need for explicit speech, and recognising them is important for human-machine interaction. Applications of automatic facial expression recognition (FER) systems are numerous and include, but are not limited to, understanding human behaviour, identifying mental health issues, and synthesising artificial human emotions. Nevertheless, recognising facial expressions with a high recognition rate remains difficult for computers. Geometry-based and appearance-based methods are the two approaches most widely used for automatic FER systems in the literature. Facial expression recognition typically comprises four steps: pre-processing, face detection, feature extraction, and expression classification. The goal of this research is to recognise the seven main human emotions (anger, disgust, fear, happiness, sadness, surprise, and neutrality) using deep learning techniques, in particular convolutional neural networks.
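To make the classification step concrete, the sketch below shows a minimal convolutional network that maps a face crop to logits over the seven emotion classes. The architecture, input size (48x48 greyscale, as in FER-2013-style datasets), and layer widths are illustrative assumptions, not the specific model evaluated in this study.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """Minimal CNN: 48x48 greyscale face crop -> 7 emotion logits.

    Illustrative sketch only; real FER models are typically deeper and
    trained with augmentation and regularisation.
    """
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),           # one logit per emotion
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Forward pass on a dummy batch of 4 pre-processed face crops.
model = EmotionCNN()
logits = model(torch.randn(4, 1, 48, 48))
print(logits.shape)  # torch.Size([4, 7])
```

In a full pipeline, this network would sit after the pre-processing and face-detection stages, and the logits would be passed through a softmax and trained with cross-entropy loss against the seven emotion labels.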