All living things, including animals, plants, and humans, require water to survive. Although water covers most of the planet, only about 1 percent of it is fresh and readily accessible. Population growth and rising demand have made this freshwater ever more valuable. More than 70 percent of the world's freshwater is used for agriculture, which is not only the largest consumer of water but also among the least efficient and most heavily subsidized users. Irrigation accounts for much of this consumption, so the water supplied to fields must be managed carefully; crop irrigation is a critical factor in agricultural productivity. The global shortage of fresh water is already serious and is expected to worsen in the years to come. Precision agriculture and intelligent irrigation are among the most promising approaches to these problems: smart irrigation systems and other modern technologies can be used to manage both the quantity and the quality of water applied to crops. Such a system can be highly accurate, but it requires data about the climate and water quality of the region where it is deployed. This study presents a smart irrigation system built on the Internet of Things (IoT) and a cloud-based architecture. The system measures water temperature, pH, total dissolved solids (TDS), and turbidity, and the readings are processed in the cloud using a range of machine learning (ML) techniques. Farmers receive accurate information on whether the water falls within acceptable quality limits, allowing them to apply effective irrigation practices that improve both yield and water quality. ML methods including support vector machines (SVM), random forest (RF), linear regression, Naive Bayes, and decision trees (DT) are used to classify the pre-processed data sets, and the algorithms are evaluated using accuracy, precision, recall, and F1-score.
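
As a minimal illustration of the classification-and-evaluation step described above, the sketch below trains the named classifiers on a hypothetical water-quality data set with temperature, pH, TDS, and turbidity features and reports the four metrics. The file name, column names, and train/test split are assumptions made for illustration, not the paper's actual pipeline; logistic regression stands in for linear regression, which is a regression rather than a classification method.

```python
# Minimal sketch of the classification-and-evaluation step.
# The CSV file, column names, and label are assumptions made for
# illustration; they are not the paper's actual data set.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

# Hypothetical data set: one row per sensor reading, with a binary
# label indicating whether the water is suitable for irrigation.
df = pd.read_csv("water_quality.csv")               # assumed file name
X = df[["temperature", "ph", "tds", "turbidity"]]   # assumed columns
y = df["suitable"]                                  # assumed label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# Scale the features; SVMs in particular are sensitive to feature ranges.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Classifiers named in the abstract (logistic regression used in
# place of linear regression, which does not produce class labels).
models = {
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=100, random_state=42),
    "LogReg": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
    "DT": DecisionTreeClassifier(random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: acc={accuracy_score(y_test, pred):.3f} "
          f"prec={precision_score(y_test, pred):.3f} "
          f"rec={recall_score(y_test, pred):.3f} "
          f"f1={f1_score(y_test, pred):.3f}")
```

In the architecture the abstract describes, this evaluation would run in the cloud on pre-processed batches of sensor readings, with the resulting classification returned to the farmer as a judgment on water-content limits.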