Abstract

Connectionist models trained with the backpropagation learning rule are known to exhibit catastrophic interference (or forgetting) under sequential training. Subsequent work showed that interference can be reduced by using orthogonal inputs. This study investigated, with a more rigorous assessment method, whether all orthogonal inputs lead to a comparable extent of interference, comparing three coding schemes. The results revealed large differences between the coding schemes. In larger networks, dense inputs led to more severe interference than sparse inputs; in smaller networks, all three schemes led to a comparable extent of interference. This study therefore showed that not all orthogonal inputs cause the same extent of interference, and that the severity of interference depends on the interaction between the input coding scheme and the network size.
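
The study's actual coding schemes, network configurations, and assessment method are not given in this abstract, so the following is only a minimal illustrative sketch of the general setup it describes: a small backpropagation network is trained on two tasks in sequence, and interference on the first task is compared across two hypothetical orthogonal input codings, sparse one-hot vectors and dense +/-1 Hadamard rows. All sizes, targets, and hyperparameters below are assumptions, not the paper's settings.

    # Sketch only: contrasts interference under two assumed orthogonal codings.
    import numpy as np

    rng = np.random.default_rng(0)

    def hadamard(n):
        # Dense orthogonal codes: rows of an n x n Hadamard matrix (n a power of 2).
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class MLP:
        # One-hidden-layer network trained with plain backpropagation (MSE loss).
        def __init__(self, n_in, n_hidden, n_out, lr=0.5):
            self.W1 = rng.normal(0, 0.3, (n_in, n_hidden))
            self.W2 = rng.normal(0, 0.3, (n_hidden, n_out))
            self.lr = lr

        def forward(self, x):
            self.h = sigmoid(x @ self.W1)
            self.y = sigmoid(self.h @ self.W2)
            return self.y

        def train(self, X, T, epochs=500):
            for _ in range(epochs):
                for x, t in zip(X, T):
                    y = self.forward(x)
                    d2 = (y - t) * y * (1 - y)                    # output-layer delta
                    d1 = (d2 @ self.W2.T) * self.h * (1 - self.h) # hidden-layer delta
                    self.W2 -= self.lr * np.outer(self.h, d2)
                    self.W1 -= self.lr * np.outer(x, d1)

        def error(self, X, T):
            return np.mean([(self.forward(x) - t) ** 2 for x, t in zip(X, T)])

    n = 8                                                 # patterns per coding (assumed)
    targets = rng.integers(0, 2, (n, 4)).astype(float)    # arbitrary binary targets

    codings = {
        "sparse (one-hot)": np.eye(n),       # orthogonal and sparse
        "dense (Hadamard)": hadamard(n),     # orthogonal but fully dense
    }
    for name, X in codings.items():
        net = MLP(n_in=n, n_hidden=16, n_out=4)
        A, B = slice(0, n // 2), slice(n // 2, n)         # two sequential "tasks"
        net.train(X[A], targets[A])                       # learn task A first
        before = net.error(X[A], targets[A])
        net.train(X[B], targets[B])                       # then train only on task B
        after = net.error(X[A], targets[A])               # interference = rise in A's error
        print(f"{name}: task-A error {before:.4f} -> {after:.4f} after task B")

Varying n_hidden in this sketch loosely mirrors the abstract's network-size manipulation; with one-hot inputs, sequential training only perturbs task A through the shared hidden-to-output weights, whereas dense codes also overwrite the shared input-to-hidden weights.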
