Abstract

The performance and behavior of many machine learning algorithms are stochastic because they explicitly use randomness in their procedures. "Stochastic" refers to a process whose outcome involves randomness and therefore carries some uncertainty. The stochastic nature of machine learning algorithms is a foundational concept and must be understood in order to interpret the behavior of predictive models effectively. These algorithms are practical for training and optimizing large systems with rich structure, and they have been deployed with considerable success in large-scale hydrological models. This contribution presents an overview of the theoretical and practical aspects of the broad family of stochastic learning algorithms that can be categorized as Stochastic Gradient Boosting and Stochastic Gradient Descent, and describes their common properties. These include well-known algorithms such as K-Means, the Perceptron, Adaline, LVQ, and Multi-Layer Networks.
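To make the stochastic behavior described above concrete, the following minimal sketch (not taken from the paper) trains an Adaline-style linear unit with stochastic gradient descent. The dataset, learning rate, and seed values are illustrative assumptions; the point is that two runs on the same data, differing only in their random seed, produce slightly different weights because of random initialization and random sample ordering.

```python
# Minimal, illustrative sketch of stochastic gradient descent for an Adaline-style
# linear unit. Data, hyperparameters, and seeds are assumptions for demonstration.
import numpy as np

def train_adaline_sgd(X, y, lr=0.01, epochs=20, seed=0):
    """Fit y ~ X @ w + b with per-sample (stochastic) gradient steps on squared error."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    w = rng.normal(scale=0.01, size=n_features)  # random weight initialization
    b = 0.0
    for _ in range(epochs):
        order = rng.permutation(n_samples)       # random sample order each epoch
        for i in order:
            error = y[i] - (X[i] @ w + b)        # Adaline updates use the raw linear output
            w += lr * error * X[i]               # gradient step for this single sample
            b += lr * error
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 2))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)
    for seed in (0, 1):
        w, b = train_adaline_sgd(X, y, seed=seed)
        print(f"seed={seed}  w={w.round(3)}  b={b:.3f}")  # weights differ slightly per seed
```

Running the script with different seeds illustrates why stochastic algorithms are typically evaluated over multiple runs rather than a single fit.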
