Given the growing volume of unorganized images on the internet and the need to use them efficiently in a wide range of applications, there is a critical need for robust models that can classify and predict image content accurately and in real time. This study therefore collects Arabic manuscript images into a dataset and predicts their handwriting styles using state-of-the-art deep learning techniques. Arabic calligraphy comprises many handwriting styles, including Al-Reqaa, Al-Nask, Al-Thulth, Al-Kufi, Al-Hur, Al-Diwani, Al-Farsi, Al-Ejaza, Al-Maghrabi, and Al-Taqraa, among others. However, the study labeled the collected dataset images by handwriting style and focused on only the six styles present in the collected Arabic manuscripts. To reach this goal, we applied the pre-trained MobileNet deep learning model to the labeled dataset images to automatically extract their features and classify them. Afterward, we evaluated the performance of the developed model using standard evaluation metrics. The results indicate that the MobileNet convolutional neural network is a promising approach, reaching a highest recorded accuracy of 0.9583 and an average F-score of 0.9633.
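
To make the described approach concrete, the following minimal sketch shows how a pre-trained MobileNet can be adapted to a six-class handwriting-style classifier with TensorFlow/Keras. This is an illustration only, not the study's exact pipeline: the directory names (manuscripts/train, manuscripts/val), image size, and training hyperparameters are assumptions.

# Minimal sketch (assumed setup, not the study's exact configuration):
# fine-tuning a pre-trained MobileNet on six Arabic handwriting-style classes.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 6          # six handwriting styles found in the manuscripts
IMG_SIZE = (224, 224)    # MobileNet's default input resolution

# Assumed directory layout: one sub-folder per handwriting style.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "manuscripts/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "manuscripts/val", image_size=IMG_SIZE, batch_size=32)

# Pre-trained MobileNet used as a frozen feature extractor.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNet expects inputs in [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(train_ds, validation_data=val_ds, epochs=10)

Per-class precision, recall, and F-scores for a held-out test set can then be computed (for example with scikit-learn's classification_report) to obtain figures comparable to the accuracy and average F-score reported above.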