As fault diagnosis in electrical machines has attracted significant interest from the research community in recent years, several methods have emerged in the literature. Moreover, since raw data signals can now be acquired easily, machine learning (ML) and deep learning (DL) are candidate tools for effective diagnosis. At the same time, identifying the presence and type of a bearing fault under noisy conditions remains a challenging task, especially when the relevant faults are at an incipient stage. Because, in real-world applications and especially in industrial processes, electrical machines operate in constantly noisy environments, the key to an effective approach lies in the preprocessing stage adopted. In this work, an evaluation study is conducted to identify the most suitable signal preprocessing techniques and the most effective model for fault diagnosis across 16 conditions/classes, from a low-workload (computational-burden) perspective, using a well-known dataset. More specifically, the reliability and resilience of conventional ML and DL models are investigated here for rolling-bearing fault detection, using data that simulate noisy industrial environments. Diverse preprocessing methods are applied in order to study the performance of different training methods from the feature-extraction perspective. These feature extraction methods include statistical features in time-domain analysis (TDA); wavelet packet decomposition (WPD); continuous wavelet transform (CWT); and signal-to-image conversion (SIC), all utilizing raw vibration signals acquired under varying load conditions. The noise effect is examined and thoroughly discussed. Finally, the paper summarizes recommended practices regarding preferred preprocessing methods and training models under different load and noise conditions.
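As a minimal illustration of the noise-simulation step mentioned above, the sketch below adds white Gaussian noise to a raw vibration signal at a target signal-to-noise ratio, a common way to emulate noisy industrial measurements. This is an assumed, generic implementation (the function name, SNR value, and synthetic signal are illustrative and not taken from the paper):

```python
import numpy as np

def add_awgn(signal: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    """Add white Gaussian noise so that 10*log10(P_signal / P_noise) == snr_db.

    This is a generic SNR-controlled corruption step, often used to
    simulate noisy industrial environments from clean laboratory data.
    """
    rng = np.random.default_rng() if rng is None else rng
    p_signal = np.mean(signal ** 2)              # average signal power
    p_noise = p_signal / (10 ** (snr_db / 10))   # noise power for target SNR
    noise = rng.normal(0.0, np.sqrt(p_noise), size=signal.shape)
    return signal + noise

# Example: corrupt a synthetic "vibration" signal at 5 dB SNR
t = np.linspace(0, 1, 12_000, endpoint=False)
clean = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 320 * t)
noisy = add_awgn(clean, snr_db=5.0, rng=np.random.default_rng(0))
```

The noisy signal would then feed whichever preprocessing branch is under evaluation (TDA statistics, WPD, CWT, or SIC), so that every model is trained and tested under the same simulated noise level.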