Abstract

This paper introduces a new hybrid approach for learning systems that builds on the theory of nonextensive statistical mechanics. The proposed learning scheme uses only the sign of the gradient, and combines adaptive-stepsize local searches with global search steps that follow an annealing schedule inspired by the nonextensive statistics proposed by Tsallis. The performance of the hybrid approach is investigated empirically through simulations on benchmark problems from the UCI Repository of Machine Learning Databases. Preliminary results provide evidence that this synergy of techniques from nonextensive statistics offers neural learning schemes significant benefits in terms of learning speed and convergence success.
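As a rough illustration of the kind of scheme the abstract describes, the sketch below pairs an Rprop-style sign-based local update with occasional global perturbations whose scale follows the temperature schedule used in Tsallis-style generalized simulated annealing, T(t) = T(1)·(2^(q-1) − 1)/((1 + t)^(q-1) − 1) for q > 1. All function names, parameter values (q, eta_plus, eta_minus), the Gaussian jump standing in for a q-Gaussian visiting distribution, and the acceptance-free global step are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def tsallis_temperature(t, t1=1.0, q=2.5):
    # Visiting-temperature schedule from Tsallis-style generalized
    # simulated annealing (q > 1): T(t) = T(1)*(2**(q-1)-1)/((1+t)**(q-1)-1)
    return t1 * (2.0 ** (q - 1.0) - 1.0) / ((1.0 + t) ** (q - 1.0) - 1.0)

def sign_step(w, grad, prev_grad, steps,
              eta_plus=1.2, eta_minus=0.5, s_min=1e-6, s_max=1.0):
    # Rprop-like local search: each weight moves against the sign of its
    # gradient; stepsizes grow while the gradient sign is stable and
    # shrink when it flips (a simplified iRprop- style rule).
    agree = grad * prev_grad
    steps = np.where(agree > 0, np.minimum(steps * eta_plus, s_max), steps)
    steps = np.where(agree < 0, np.maximum(steps * eta_minus, s_min), steps)
    return w - np.sign(grad) * steps, steps

def hybrid_train(loss_grad, w, epochs=200, global_every=20, rng=None):
    # Alternate adaptive sign-based local steps with periodic global
    # search steps whose scale is set by the annealing temperature.
    rng = np.random.default_rng(rng)
    steps = np.full_like(w, 0.01)
    prev_grad = np.zeros_like(w)
    for t in range(1, epochs + 1):
        grad = loss_grad(w)
        w, steps = sign_step(w, grad, prev_grad, steps)
        prev_grad = grad
        if t % global_every == 0:
            # Global step: an unconditional Gaussian jump is used here as a
            # stand-in for a q-Gaussian visiting distribution with a
            # Metropolis-style acceptance test.
            w = w + rng.normal(scale=tsallis_temperature(t), size=w.shape)
    return w

# Example: minimize ||w||^2, whose gradient is 2w.
w_final = hybrid_train(lambda w: 2.0 * w, np.array([3.0, -2.0]), rng=0)
```

Because the temperature decays polynomially rather than logarithmically, early global steps explore broadly while late ones leave the sign-based local search essentially undisturbed.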
