Abstract
A central claim shared by most recent models of short-term memory (STM) is that item knowledge is coded independently from order in long-term memory (LTM; e.g., the letter A is coded by the same representational unit whether it occurs at the start or end of a sequence). Serial order is computed by dynamically binding these item codes to a separate representation of order. By contrast, Botvinick and Plaut developed a parallel distributed processing (PDP) model of STM that codes for item-order information conjunctively, such that the same letter in different positions is coded differently in LTM. Their model supports a wide range of memory phenomena, and critically, STM is better for lists that include high, as opposed to low, sequential dependencies (e.g., bigram effects). Models with context-independent item representations do not currently account for sequential effects. However, we show that their PDP model is too sensitive to these effects. A modified version of the model does better but still fails in important respects. The successes and failures can be attributed to a fundamental constraint associated with context-dependent representations. We question the viability of conjunctive coding schemes to support STM and take these findings as problematic for the PDP approach to cognition more generally.
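The contrast at issue can be made concrete with a toy sketch. The following is an illustrative example only, not the authors' model or Botvinick and Plaut's simulation code: it assumes arbitrary vector dimensions, a random seed, and an elementwise-product binding operation purely to show how context-independent item codes bound to a separate order representation differ from conjunctive item-position codes.

```python
# Illustrative sketch only (hypothetical encoding functions, arbitrary parameters):
# contrasting context-independent item codes bound to position codes with
# conjunctive item-position codes, as described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
letters = list("ABC")
dim = 8

# Scheme 1: context-independent item codes plus a separate order code.
# Each letter has ONE fixed vector regardless of where it appears; serial
# order is carried by distinct position vectors, and the list is represented
# by dynamically binding item codes to position codes.
item_codes = {ch: rng.standard_normal(dim) for ch in letters}
pos_codes = {p: rng.standard_normal(dim) for p in range(len(letters))}

def encode_independent(seq):
    # Bind each item code to its position code (elementwise product is one
    # common toy binding operation); the item code itself never changes.
    return [item_codes[ch] * pos_codes[p] for p, ch in enumerate(seq)]

# Scheme 2: conjunctive item-position coding.
# The same letter in different positions gets a DIFFERENT representational
# unit, so "A first" and "A last" share nothing by construction.
conjunctive_codes = {(ch, p): rng.standard_normal(dim)
                     for ch in letters for p in range(len(letters))}

def encode_conjunctive(seq):
    return [conjunctive_codes[(ch, p)] for p, ch in enumerate(seq)]

if __name__ == "__main__":
    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Independent coding: the code for 'A' is identical across positions.
    print("independent, A vs A:", cosine(item_codes["A"], item_codes["A"]))
    # Conjunctive coding: 'A' in position 0 and 'A' in position 2 are
    # unrelated random vectors (similarity near zero).
    print("conjunctive, (A,0) vs (A,2):",
          cosine(conjunctive_codes[("A", 0)], conjunctive_codes[("A", 2)]))
```

Under the first scheme, knowledge about the letter A transfers automatically across list positions; under the second, it does not, which is the constraint the abstract identifies as the source of both the model's sensitivity to sequential dependencies and its failures.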