Abstract

Gradient descent is an optimization algorithm widely used to estimate the parameters of machine learning models. Through iterative updates along the gradient of the objective function, it gradually approaches the optimal solution, ultimately minimizing the loss function and yielding the corresponding parameters. Gradient descent is frequently used to fit logistic regression, a common binary classification method. Through experiments, this paper compares and analyzes the differences between batch gradient descent and its derivative algorithms, stochastic gradient descent and mini-batch gradient descent, in terms of iteration count and loss function, and offers suggestions on choosing the most suitable algorithm for binary logistic regression tasks in machine learning.
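The three variants the abstract compares differ only in how many samples contribute to each gradient step. A minimal sketch (not the paper's implementation; the dataset, learning rate, and epoch count below are illustrative assumptions) of logistic regression trained with batch, stochastic, and mini-batch gradient descent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, epochs=200, batch_size=None, seed=0):
    """Gradient descent for logistic regression.

    batch_size=None -> batch GD (all samples per step)
    batch_size=1    -> stochastic GD (one sample per step)
    batch_size=k    -> mini-batch GD (k samples per step)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)          # reshuffle each epoch
        for start in range(0, n, bs):
            b = idx[start:start + bs]
            # Gradient of the cross-entropy loss over the current batch.
            grad = X[b].T @ (sigmoid(X[b] @ w) - y[b]) / len(b)
            w -= lr * grad
    return w

def log_loss(X, y, w):
    p = np.clip(sigmoid(X @ w), 1e-12, 1 - 1e-12)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy linearly separable data with a bias column (illustrative only).
rng = np.random.default_rng(42)
X = np.c_[np.ones(200), rng.normal(size=(200, 2))]
y = (X[:, 1] + X[:, 2] > 0).astype(float)

for name, bs in [("batch", None), ("stochastic", 1), ("mini-batch", 32)]:
    w = train(X, y, batch_size=bs)
    print(f"{name}: final loss = {log_loss(X, y, w):.3f}")
```

All three variants converge on this toy problem; as the paper's experiments suggest, they trade off gradient accuracy per step (batch) against update frequency and cost per step (stochastic), with mini-batch as the compromise.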
