Abstract

We apply an exemplar model of memory to explain performance in the artificial grammar task. The model blends the convolution-based method for representation developed in Jones and Mewhort's BEAGLE model of semantic memory (Psychological Review 114:1-37, 2007) with the storage and retrieval assumptions in Hintzman's MINERVA 2 model of episodic memory (Behavior Research Methods, Instruments, and Computers, 16:96-101, 1984). The model captures differences in encoding to fit data from two experiments that document the influence of encoding on implicit learning. We provide code so that researchers can adapt the model and techniques to their own experiments.
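To make the blend of representational and memory assumptions concrete, the sketch below is a minimal, hypothetical illustration (not the authors' released code): it assumes Python/NumPy, represents letters as random BEAGLE-style environmental vectors, binds adjacent letters with circular convolution as a simplified stand-in for BEAGLE's full ordinal encoding, and scores test strings with MINERVA 2's similarity-cubed echo intensity. The encoding-quality manipulations that the paper fits are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024  # vector dimensionality (assumed value)

def env(n_dim=N):
    """Random environmental vector, BEAGLE-style: elements ~ N(0, 1/n)."""
    return rng.normal(0.0, 1.0 / np.sqrt(n_dim), n_dim)

def cconv(a, b):
    """Circular convolution (binding), computed via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def encode_string(letters, lexicon):
    """Encode a letter string as a sum of convolved bigram chunks
    (a simplification of BEAGLE's ordinal encoding)."""
    trace = np.zeros(N)
    vecs = [lexicon[ch] for ch in letters]
    for i in range(len(vecs) - 1):
        trace += cconv(vecs[i], vecs[i + 1])
    return trace

def echo_intensity(probe, memory):
    """MINERVA 2 retrieval: cube each stored trace's similarity to the
    probe and sum; higher intensity ~ stronger familiarity."""
    sims = memory @ probe / (np.linalg.norm(memory, axis=1)
                             * np.linalg.norm(probe) + 1e-12)
    return np.sum(sims ** 3)

# Toy usage with a hypothetical letter set and study list.
lexicon = {ch: env() for ch in "MTVRX"}
study = ["MTV", "VXR", "MTX"]
memory = np.vstack([encode_string(s, lexicon) for s in study])
print(echo_intensity(encode_string("MTV", lexicon), memory))  # studied string
print(echo_intensity(encode_string("RRR", lexicon), memory))  # novel string
```

In an artificial grammar simulation of this kind, the echo intensity of a test string relative to the stored study exemplars would serve as the familiarity signal driving grammaticality judgments.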
