This research lays the groundwork for intelligently classifying hand movements for use with prosthetic hands. The hand motions are classified using surface electromyography (sEMG) signals. Every muscle contracts in response to a predetermined sequence of fibre activations, and the resulting sEMG signals can inform control protocols for bio-control systems such as human-computer interaction and upper-limb prostheses. For hand-gesture recognition, data gloves and vision-based approaches are commonly used; the data-glove technique requires tedious and unnatural user engagement, whereas vision-based solutions require significantly more expensive sensors. To circumvent these limitations, this research presents an electromyography-based automated hand-gesture recognition system built on a Deep Neural Network (DNN). The primary aim of this work is to improve the performance of the hand-gesture recognition system through an artificial neural classifier. To raise the system's classification accuracy, this study describes how to build neural-network models and apply signal-processing methods. Applying the Hilbert-Huang Transform (HHT) extracts the essential properties of the signal, and these features are fed into a DNN classifier for training. The experimental results show that the proposed technique achieves a higher classification rate (98.5%) than the alternatives.
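As a rough illustration of the feature-extraction stage, the sketch below computes Hilbert-transform-based features (instantaneous envelope and frequency statistics) from a synthetic sEMG-like burst. This is only the Hilbert spectral-analysis half of the HHT; the paper's full pipeline also involves empirical mode decomposition and a trained DNN, neither of which is reproduced here. The function names, the synthetic signal, and the particular four-feature summary are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal of a real sequence via the FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0          # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def hht_features(x, fs):
    """Illustrative 4-element feature vector: envelope and
    instantaneous-frequency statistics (an assumed feature set)."""
    z = analytic_signal(x)
    env = np.abs(z)                         # instantaneous amplitude
    phase = np.unwrap(np.angle(z))          # instantaneous phase
    inst_freq = np.diff(phase) * fs / (2 * np.pi)  # Hz
    return np.array([env.mean(), env.std(),
                     inst_freq.mean(), inst_freq.std()])

# Synthetic burst loosely resembling a muscle-activation epoch
fs = 1000
t = np.arange(0, 1, 1 / fs)
sig = np.sin(2 * np.pi * 50 * t) * np.exp(-((t - 0.5) ** 2) / 0.02)
feats = hht_features(sig, fs)
print(feats.shape)  # → (4,)
```

In a full system, feature vectors like `feats` (one per sEMG window and channel) would form the input matrix for training the DNN classifier described in the abstract.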