Deep learning, a specialized branch of machine learning, uses multilayer neural networks for advanced data-processing tasks in which feature extraction and refinement require more than one layer of processing. A Brain-Computer Interface (BCI) is a direct communication link between the brain's electrical activity and an external device. In this work, brain data are gathered with BCI technology using electrodes placed on the scalp, and the research focuses on recognizing multimodal emotions after feature extraction and multilayer processing with an Artificial Neural Network (ANN). Several organizations have recently released large datasets of physiological signals (EEG, eye movements) recorded while volunteers performed tasks designed to elicit different emotions; these collections were built specifically to encourage the development of effective deep-learning emotion-recognition algorithms. In this study, we evaluate deep learning approaches, specifically the Artificial Neural Network (ANN), on the SEED-IV electroencephalogram (EEG) dataset. We briefly present our implementation of emotion recognition with an ANN, one of the topics of our research, explore the operating principles of artificial neural networks, which serve as our primary model for identifying emotions, and offer several proposals for applying neural networks to EEG inputs. With the ANN and 3D convolutional layers, we achieved an accuracy of 63.425%.
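To make the pipeline concrete, the sketch below shows a minimal feed-forward ANN for 4-class emotion recognition of the kind described above. It is an illustrative assumption, not the paper's exact model: the input size (310 = 62 channels x 5 frequency bands, a common SEED-IV feature layout), the hidden-layer width, and the synthetic input batch are all placeholders for real extracted EEG features.

```python
import numpy as np

# Minimal sketch of a feed-forward ANN for 4-class emotion recognition
# (SEED-IV labels: neutral, sad, fear, happy). Input dimensionality and
# layer sizes are illustrative assumptions, not the paper's architecture.

rng = np.random.default_rng(0)

def relu(x):
    # Element-wise rectified linear activation.
    return np.maximum(0.0, x)

def softmax(z):
    # Numerically stable softmax over the class axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class EmotionANN:
    def __init__(self, n_in=310, n_hidden=128, n_classes=4):
        # Small random initial weights; biases start at zero.
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.b2 = np.zeros(n_classes)

    def forward(self, X):
        # One hidden layer with ReLU, softmax output over emotion classes.
        h = relu(X @ self.W1 + self.b1)
        return softmax(h @ self.W2 + self.b2)

# Synthetic batch of 8 samples standing in for extracted EEG features.
X = rng.normal(size=(8, 310))
probs = EmotionANN().forward(X)
print(probs.shape)  # (8, 4): one probability distribution per sample
```

In practice the network would be trained with cross-entropy loss on the SEED-IV features, and the paper's reported 63.425% accuracy additionally relies on 3D convolutional layers not shown in this untrained sketch.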