Abstract

The authors explore the learning process of recurrent neural networks on the learning surface over which learning is executed. Computer simulations show that learning, i.e., the search for optimal adjustable parameters, proceeds as a gentle steepest-gradient descent forward along the bottom of a curved valley. This implies that the learning surface has a characteristic shape. These learning characteristics are essentially consistent with those of the multilayer neural networks analyzed by Gouhara et al.
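To illustrate the kind of behavior the abstract describes, the following is a minimal sketch, not the authors' simulation: plain gradient descent on the Rosenbrock function, a standard stand-in for an error surface shaped like a curved valley. The starting point, learning rate, and step count are arbitrary choices for illustration. The trajectory drops quickly onto the valley floor and then creeps slowly along its curved bottom toward the minimum at (1, 1), which is the qualitative pattern the abstract attributes to learning in recurrent networks.

```python
# Illustrative sketch only: steepest-descent on a curved-valley surface.
# The Rosenbrock function stands in for the learning surface; it is not
# the recurrent-network error surface studied in the paper.
import numpy as np

def rosenbrock(p):
    """Curved-valley test surface with its minimum at (1, 1)."""
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def rosenbrock_grad(p):
    """Analytic gradient of the Rosenbrock function."""
    x, y = p
    dx = -2.0 * (1.0 - x) - 400.0 * x * (y - x ** 2)
    dy = 200.0 * (y - x ** 2)
    return np.array([dx, dy])

p = np.array([-1.2, 1.0])   # start on the valley wall (arbitrary choice)
lr = 5e-4                   # small step size keeps the descent stable

for step in range(50000):
    p = p - lr * rosenbrock_grad(p)   # steepest-gradient update
    if step % 10000 == 0:
        print(f"step {step:6d}  loss {rosenbrock(p):.6f}  params {p}")

# The early steps reduce the loss rapidly (falling to the valley floor);
# later steps make only slow progress along the curved valley bottom.
print("final", p, rosenbrock(p))
```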
