Force myography (FMG) can detect changes in muscle volume that can be interpreted to recognize human intention. FMG data, however, are highly dependent on the placement of the sensors over the muscles, and interpretation becomes challenging if the sensors are displaced. This paper presents a robust sensor-over-muscle-independence (SOMI) preprocessing algorithm combined with a lightweight deep neural network (DNN) that achieves high classification accuracy on FMG data. SOMI reorganizes the irregular sensor data into regular patterns, making the DNN insensitive not only to position and rotation shifts but also to flips of the sensor arrangement. A custom-designed FMG band is used for payload recognition to experimentally validate the proposed method with five payload statuses and eight subjects. A five-fold cross-validation comparative study demonstrated that the proposed method is 19.8% and 7.1% more accurate than a support vector machine (SVM) and a DNN without SOMI, respectively, and outperforms k-nearest neighbors (KNN) and decision tree (DT) classifiers. SOMI enables the lightweight DNN to maintain accuracy above 98% under arbitrary wearing schemes of the FMG band on both the left and right upper arms.
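The abstract does not spell out the SOMI procedure itself; as a purely illustrative sketch of the kind of invariance it targets, the snippet below canonicalizes a ring of sensor readings by selecting the lexicographically smallest ordering over all rotations and reflections of the band. The function name canonicalize_ring and the minimum-over-rotations rule are assumptions made for illustration, not the paper's actual algorithm.

    import numpy as np

    def canonicalize_ring(readings):
        """Map circular FMG sensor readings to a rotation- and flip-invariant order.

        Among all cyclic rotations of the array and of its mirror image, the
        lexicographically smallest one is returned, so two recordings that differ
        only in how the band was rotated or flipped yield the same pattern.
        (Hypothetical illustration; not the SOMI algorithm from the paper.)
        """
        x = np.asarray(readings, dtype=float)
        n = len(x)
        candidates = [np.roll(x, k) for k in range(n)]            # all rotations
        candidates += [np.roll(x[::-1], k) for k in range(n)]     # rotations of the flip
        return min(candidates, key=lambda c: tuple(c))

    # Example: the same muscle pattern recorded with the band rotated and flipped
    a = [0.2, 0.9, 0.4, 0.1, 0.0, 0.3]
    b = list(reversed(np.roll(a, 2).tolist()))   # shifted by two sensors, then flipped
    print(np.allclose(canonicalize_ring(a), canonicalize_ring(b)))  # True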