Abstract

Step size selection is a crucial aspect of stochastic optimization methods, both in theoretical analysis and in practical applications. We propose two stochastic optimization methods based on the competitive Barzilai-Borwein (BB) step size, applied in the inner and outer loops of the mini-batch semi-stochastic gradient descent (mS2GD) algorithm. The competitive BB step size is updated automatically using the latest, most accurate secant equation. We introduce two algorithms: mS2GD-CBB, which updates the step size in the outer loop, and mS2GD-RCBB, which updates it in the inner loop. We evaluate the proposed algorithms on classical optimization problems and compare them against existing methods. Experimental results demonstrate that the methods exhibit favorable convergence properties and can effectively handle the challenges of big data arising in signal processing, statistics, and machine learning. The methods offer enhanced adaptability, flexibility, and exploration capability, making them promising for a wide range of optimization problems.
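For context, the abstract builds on the Barzilai-Borwein step size, which is derived from a secant pair of iterates and gradients. The sketch below shows the two classical BB formulas only, as an assumed illustration; the paper's "competitive" variant and its selection rule between them are not specified in the abstract, so this is not the authors' exact update.

```python
import numpy as np

def bb_step_sizes(s, y):
    """Classical Barzilai-Borwein step sizes from the secant pair
    s = x_k - x_{k-1} and y = g_k - g_{k-1} (gradient difference).
    BB1 = (s^T s) / (s^T y),  BB2 = (s^T y) / (y^T y)."""
    sy = s @ y
    if sy <= 0:
        # Secant curvature is non-positive; BB formulas are unreliable here.
        raise ValueError("non-positive curvature s^T y")
    bb1 = (s @ s) / sy
    bb2 = sy / (y @ y)
    return bb1, bb2

# Illustration on a simple quadratic f(x) = 0.5 * x^T A x, where y = A s.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x

x_prev = np.array([1.0, 1.0])
x_curr = np.array([0.9, 0.5])
s = x_curr - x_prev
y = grad(x_curr) - grad(x_prev)
bb1, bb2 = bb_step_sizes(s, y)
```

For a strongly convex quadratic, both step sizes are inverse Rayleigh quotients of the Hessian, so they lie between the reciprocals of its largest and smallest eigenvalues, with BB2 never exceeding BB1 (by the Cauchy-Schwarz inequality).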
