Human-computer interaction (HCI) systems have become an increasingly essential part of everyday life as computerized devices pervade society. HCI determines how efficiently the information flow among computing, communication, and display technologies is used. In recent years there has been considerable interest in developing intuitive interfaces that detect a user's body motions and convert them into computer instructions. Various biomedical signals (biosignals) obtained from a specific tissue, organ, or cell system, such as the nervous system, can be used for neural interfacing with computers; examples include the electroencephalogram (EEG), the electrooculogram (EOG), and the electromyogram (EMG). Such methods are particularly valuable for people with physical disabilities.

Many efforts have been made to build HCIs from gesture-derived EMG signals. Current work in EMG signal processing and control includes continuous EMG-signal classification for graphical control, which lets physically disabled users operate word-processing applications and other personal-computer software, as well as the internet. EMG-based control also extends to robotic devices, prosthetic limbs, input devices for virtual-reality games, and fitness equipment, among other applications. Most developmental research focuses on neural networks for pattern recognition: an EMG controller can be trained to recognize gestures through signal analysis of groups of muscle action potentials.

The aim of this review article is to examine the different methods and algorithms used to classify EMG signals in order to convert them into computer commands.
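To make the gesture-recognition pipeline concrete, the sketch below illustrates the general idea of classifying windowed EMG activity from time-domain features. It is only a toy example under stated assumptions: synthetic Gaussian noise stands in for real EMG recordings, a nearest-centroid rule stands in for the neural-network classifiers discussed in the literature, and all function names and gesture labels are illustrative.

```python
import random

def extract_features(window):
    """Two classic time-domain EMG features for one analysis window:
    mean absolute value (MAV) and zero-crossing rate (fraction of
    sample pairs that change sign)."""
    mav = sum(abs(x) for x in window) / len(window)
    zcr = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / len(window)
    return (mav, zcr)

def train_centroids(labelled_windows):
    """Average the feature vectors per gesture label.  A nearest-centroid
    classifier is used here purely for brevity; in practice a neural
    network would be trained on the same features."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        mav, zcr = extract_features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += mav
        s[1] += zcr
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def classify(centroids, window):
    """Assign a window to the gesture whose centroid is nearest
    in feature space; this decision would drive a computer command."""
    mav, zcr = extract_features(window)
    return min(centroids,
               key=lambda lab: (centroids[lab][0] - mav) ** 2
                             + (centroids[lab][1] - zcr) ** 2)

def synth_emg(amplitude, n=200):
    """Synthetic stand-in for a raw EMG window: zero-mean noise whose
    amplitude grows with muscle activation."""
    return [random.gauss(0.0, amplitude) for _ in range(n)]

if __name__ == "__main__":
    random.seed(42)
    # Hypothetical two-gesture training set: low activation ("rest"),
    # high activation ("grip").
    train = ([("rest", synth_emg(0.1)) for _ in range(20)]
             + [("grip", synth_emg(1.0)) for _ in range(20)])
    centroids = train_centroids(train)
    assert classify(centroids, synth_emg(1.0)) == "grip"
    assert classify(centroids, synth_emg(0.1)) == "rest"
```

A real controller would segment a continuous electrode stream into overlapping windows, extract a richer feature set, and map each predicted label to an application command (e.g., a cursor action).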