Abstract

In a previous study, a new adaptive method (AM) was developed to adjust the learning rate in artificial neural networks: the generalized no-decrease adaptive method (GNDAM). The GNDAM is fundamentally different from traditional AMs. Instead of using the sign of a given weight's derivative to adjust its learning rate, it relies on a trial-and-error heuristic in which global learning rates are adjusted according to the error rates produced by two identical networks trained with different learning rates. The GNDAM was developed to solve a particular task: detecting the orientation of an image defined by texture (the texture task). This task is also fundamentally different from traditional ones because its data set is infinite: each pattern is a template used to generate the stimuli that the network learns to classify. In the previous study, the GNDAM outperformed standard backpropagation on this task. The present study compares this new AM to traditional AMs on the texture task and on other benchmark tasks. The results show that some AMs work well on some tasks while others work better on others; however, none of them achieved good performance across all tasks.
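
The trial-and-error heuristic summarized above can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact GNDAM: the toy logistic model, the perturbation factor of 1.5, and the accept/back-off rule are all assumptions introduced only to show the two-network rate comparison described in the abstract.

    # Sketch of a trial-and-error global learning-rate heuristic (assumed
    # details; NOT the published GNDAM algorithm). Two identical copies of
    # the network are trained with different global rates, and the rate that
    # yields the lower error is kept.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))                    # toy inputs
    y = (X @ rng.normal(size=10) > 0).astype(float)   # toy binary targets

    def train_epoch(w, lr):
        """One epoch of gradient descent on logistic loss."""
        p = 1.0 / (1.0 + np.exp(-(X @ w)))            # sigmoid outputs
        grad = X.T @ (p - y) / len(y)                 # mean gradient
        return w - lr * grad

    def error_rate(w):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        return float(np.mean((p > 0.5) != y))

    w = np.zeros(10)
    lr = 0.1                                          # global learning rate
    for epoch in range(50):
        # Train two identical copies with different global learning rates.
        w_base = train_epoch(w.copy(), lr)
        w_trial = train_epoch(w.copy(), lr * 1.5)     # perturbed rate (assumption)
        # Keep the weights and rate that produced the lower error rate.
        if error_rate(w_trial) <= error_rate(w_base):
            w, lr = w_trial, lr * 1.5
        else:
            w, lr = w_base, lr / 1.5                  # back off (assumption)
    print("final error rate:", error_rate(w))

Note that the comparison signal here is the networks' error rates, not per-weight derivative signs, which is what distinguishes this family of heuristics from traditional AMs.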
