Abstract
This article introduces a connectionist model of category learning that takes into account the prior knowledge that people bring to new learning situations. In contrast to connectionist learning models that assume a feedforward network and learn by the delta rule or backpropagation, this model, the knowledge-resonance model, or KRES, employs a recurrent network with bidirectional symmetric connections whose weights are updated according to a contrastive Hebbian learning rule. We demonstrate that when prior knowledge is represented in the network, KRES accounts for a considerable range of empirical results regarding the effects of prior knowledge on category learning, including (1) the accelerated learning that occurs in the presence of knowledge, (2) the better learning, in the presence of knowledge, of category features that are themselves unrelated to that knowledge, (3) the reinterpretation of ambiguous features in light of error-corrective feedback, and (4) the unlearning of prior knowledge when that knowledge is inappropriate in the context of a particular category.
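The abstract only names the learning rule; for illustration, the following is a minimal NumPy sketch of contrastive Hebbian learning in a recurrent network with bidirectional symmetric weights. The settling procedure, learning rate, and all function and variable names are assumptions introduced here, not the paper's implementation (KRES additionally includes units and connections that encode prior knowledge).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def settle(W, b, x, clamped, n_steps=50):
    """Iteratively update unclamped units until the recurrent network settles.
    W is symmetric (w_ij == w_ji); `clamped` is a boolean mask of fixed units."""
    for _ in range(n_steps):
        net = W @ x + b                     # net input over bidirectional connections
        x = np.where(clamped, x, sigmoid(net))  # only free units are updated
    return x

def chl_step(W, b, x_input, input_mask, target, target_mask, lr=0.1):
    """One contrastive Hebbian learning update (minus phase vs. plus phase)."""
    # Minus (free) phase: clamp only the input units and let the network settle.
    x_minus = settle(W, b, np.where(input_mask, x_input, 0.5), input_mask)
    # Plus (clamped) phase: also clamp the desired output units, then settle again.
    start = np.where(target_mask, target, x_minus)
    x_plus = settle(W, b, start, input_mask | target_mask)
    # Weight change is the difference of Hebbian co-activations between phases.
    dW = lr * (np.outer(x_plus, x_plus) - np.outer(x_minus, x_minus))
    np.fill_diagonal(dW, 0.0)               # no self-connections; symmetry is preserved
    return W + dW, b + lr * (x_plus - x_minus)
```

In this sketch, error correction is implicit: weights change only to the extent that the freely settled activations differ from the activations obtained when the correct category is clamped.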