We apply a multiple-trace model of memory to explain performance in the artificial grammar task. The model blends the convolution-based representation scheme of Jones and Mewhort's BEAGLE model of semantic memory (Jones, M. N., & Mewhort, D. J. K. (2007). Representing word meaning and order information in a composite holographic lexicon. Psychological Review, 114, 1-37) with the multiple-trace storage and retrieval mechanism of Hintzman's MINERVA 2 model of episodic memory (Hintzman, D. L. (1986). "Schema abstraction" in a multiple-trace memory model. Psychological Review, 93, 411-428). We report an artificial grammar experiment and fit the model to the data at the level of individual items. We argue that performance in the artificial grammar task is best understood as a process of retrospective inference from memory.
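As a rough illustration of the blend the abstract describes, the sketch below combines circular-convolution binding of adjacent letters (the representational device BEAGLE uses for order information) with MINERVA 2's storage of each studied string as a separate trace and its cubed-similarity "echo intensity" at retrieval. This is a minimal, hypothetical sketch, not the authors' implementation: the letter set, study strings, and vector dimensionality are illustrative assumptions, and BEAGLE's full encoding (including the permutations that make binding non-commutative) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # dimensionality of the holographic vectors (illustrative choice)

def cconv(a, b):
    # Circular convolution via FFT: the binding operation used in
    # BEAGLE-style holographic representations.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def normalize(v):
    return v / np.linalg.norm(v)

# Random environmental vectors for the grammar's letters (hypothetical alphabet)
letters = {ch: normalize(rng.normal(0.0, 1.0 / np.sqrt(D), D)) for ch in "MTVRX"}

def encode(string):
    # Represent a string as the sum of its convolution-bound bigrams.
    v = np.zeros(D)
    for a, b in zip(string, string[1:]):
        v += cconv(letters[a], letters[b])
    return normalize(v)

# MINERVA 2: every studied string is stored as its own trace.
study_strings = ["MTV", "MVRX", "TVX"]  # hypothetical training items
traces = np.array([encode(s) for s in study_strings])

def echo_intensity(probe):
    # MINERVA 2 retrieval: each trace is activated in proportion to the
    # cube of its similarity to the probe; the summed activation is the
    # echo intensity, read as a familiarity / grammaticality signal.
    sims = traces @ encode(probe)
    return float(np.sum(sims ** 3))

# A studied string should yield a stronger echo than a novel letter sequence.
print(echo_intensity("MTV"), echo_intensity("XRM"))
```

On this reading, an endorsement decision in the artificial grammar task falls out of retrieval alone: the probe's echo intensity against stored exemplars stands in for an inference of grammaticality, with no abstracted rule system needed.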