Abstract

We propose a technique for first-order gradient-based optimization of stochastic objective functions, called Nesterov-accelerated adaptive moment estimation (Nadam), which is based on adaptive estimates of lower-order moments. The method combines adaptive moment estimation (Adam) with the Nesterov accelerated gradient. As a result, the technique is straightforward to implement, computationally efficient, has low memory requirements, and is well suited to problems with large amounts of data and/or parameters. We also analyze the algorithm's convergence properties and provide a conservative bound on its convergence rate. Finally, we apply the technique to the detection and classification of safety helmets.
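For concreteness, the sketch below shows one commonly cited formulation of the Nadam update (following Dozat, 2016), under the assumption that the paper uses the standard update rule; the `nadam_step` helper, the hyperparameter values, and the toy quadratic objective are illustrative choices, not taken from the paper itself.

```python
import numpy as np

def nadam_step(theta, grad, m, v, t,
               lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update: Adam's moment estimates plus a Nesterov-style
    look-ahead that applies the current step's momentum term early."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: blend corrected momentum with the raw gradient
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * m_bar / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Illustrative usage: minimize f(theta) = ||theta||^2 (gradient = 2*theta).
theta = np.array([3.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):                        # t starts at 1 for bias correction
    grad = 2.0 * theta
    theta, m, v = nadam_step(theta, grad, m, v, t)
print(theta)                                    # approaches the minimizer [0, 0]
```

The difference from plain Adam is the `m_bar` line: instead of stepping with the previous momentum estimate, it applies the current step's momentum term immediately, which gives the Nesterov look-ahead effect.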
