The object of the study is effective wellbore cleaning during drilling. The subject of the study is the development of a machine learning model based on a neural network for predicting the optimal minimum drilling fluid flow rate. The challenge addressed is the need to improve well cleaning efficiency to prevent stuck pipe and the associated downtime and costs. During the study, a neural network model for predicting the minimum flow rate required for wellbore cleaning was developed and tested. Recurrent architectures, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), were used to process time-series data efficiently and to capture long-term dependencies. The model was trained and evaluated on field data: the mean squared error (MSE) reached 0.019169 for LSTM and 0.0828 for GRU, indicating high prediction accuracy. The results are attributed to these recurrent architectures and the associated machine learning algorithms, which enable efficient training on large volumes of data and allow complex dependencies and influencing factors to be taken into account. Distinctive features of the results are the high prediction accuracy and the applicability of the model in real-world conditions: it demonstrates reliable performance in predicting the minimum flow rate. The results of the study can be used to optimize well drilling processes. Practical applications include using the model to predict the optimal minimum flow rate under various conditions, which reduces the risk of stuck pipe and increases the efficiency of drilling operations. The model can be integrated into existing drilling monitoring and control systems to improve their performance.
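The abstract names GRU (and LSTM) gating as the mechanism that lets the model capture long-term dependencies in drilling time series. The paper's exact architecture, hyperparameters, and training data are not given here, so the following is only an illustrative NumPy sketch of the standard GRU cell update; all names and the random initialization are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: update gate z, reset gate r, candidate state h_tilde."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(x @ Wz + h_prev @ Uz + bz)              # update gate: how much new state to accept
    r = sigmoid(x @ Wr + h_prev @ Ur + br)              # reset gate: how much history to expose
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh + bh)  # candidate hidden state
    return (1.0 - z) * h_prev + z * h_tilde             # convex mix of old and candidate state

def run_gru(sequence, n_hidden, seed=0):
    """Run a single-layer GRU over a (T, n_in) sequence; return the final hidden state."""
    rng = np.random.default_rng(seed)
    n_in = sequence.shape[1]
    def mat(a, b):
        return rng.normal(scale=0.1, size=(a, b))
    # Randomly initialized weights, purely for illustration (no training here).
    params = (mat(n_in, n_hidden), mat(n_hidden, n_hidden), np.zeros(n_hidden),
              mat(n_in, n_hidden), mat(n_hidden, n_hidden), np.zeros(n_hidden),
              mat(n_in, n_hidden), mat(n_hidden, n_hidden), np.zeros(n_hidden))
    h = np.zeros(n_hidden)
    for x in sequence:          # step through the time series, carrying state forward
        h = gru_step(x, h, params)
    return h
```

In a full pipeline, the final hidden state would feed a dense output layer predicting the minimum flow rate, with the model trained under the MSE loss reported in the abstract.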