Abstract

The existence of implicit (unconscious) learning has been demonstrated in several laboratory paradigms. Researchers have also suggested that it plays a role in complex real-life human activities. For instance, in social situations, we may follow unconscious behaviour scripts or intuitively anticipate the reactions of familiar persons based on nonconscious cues. Still, it is difficult to make inferences about the involvement of implicit learning in realistic contexts, given that this phenomenon has been demonstrated almost exclusively with simple artificial stimuli (e.g., learning structured patterns of letters). In addition, recent analyses show that the amount of unconscious knowledge acquired in these tasks has been inflated by random measurement error. To overcome these limitations, we adapted the artificial grammar learning (AGL) task and exposed participants (N = 93), in virtual reality (VR), to a realistic agent that executed combinations of boxing punches. Unknown to participants, the combinations were structured by a complex artificial grammar. In a subsequent test phase, participants accurately discriminated novel grammatical from nongrammatical combinations, showing that they had acquired the grammar. To measure awareness, we used trial-by-trial subjective scales and an analytical method that accounts for the possible overestimation of unconscious knowledge due to regression to the mean. Together, these methods showed strong evidence for both implicit and explicit learning. The present study is the first to show that humans can implicitly learn, in VR, knowledge regarding realistic body movements and, further, that implicit knowledge extracted in AGL is robust when accounting for its possible inflation by random measurement error.
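As an illustrative aside (not the analysis reported in the study), the minimal simulation below sketches, under assumed parameters, how random measurement error in trial-by-trial confidence ratings can inflate apparent unconscious knowledge: accuracy on trials reported as guesses can exceed chance even when trials carrying truly no knowledge are exactly at chance, which is the regression-to-the-mean artefact the abstract refers to.

```python
# Hypothetical sketch: measurement error in confidence ratings inflates
# "unconscious" accuracy. All distributions and thresholds are assumptions
# chosen for illustration, not parameters from the study.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000

# Latent knowledge strength per trial: half the trials carry no knowledge,
# the other half carry some (conscious) knowledge.
knowledge = np.where(rng.random(n_trials) < 0.5,
                     0.0,
                     rng.uniform(0.1, 0.9, n_trials))

# Accuracy depends only on latent knowledge; zero knowledge -> chance (0.5).
p_correct = 0.5 + 0.5 * knowledge
correct = rng.random(n_trials) < p_correct

# Reported confidence = latent knowledge + random measurement error.
reported_conf = knowledge + rng.normal(0.0, 0.25, n_trials)

# Trials *reported* as guesses (low confidence) include knowledge-bearing
# trials misclassified by noise, so their accuracy regresses above chance.
reported_guess = reported_conf < 0.05
true_guess = knowledge == 0.0

print(f"Accuracy on truly knowledge-free trials:  {correct[true_guess].mean():.3f}")      # ~0.50
print(f"Accuracy on trials reported as guesses:   {correct[reported_guess].mean():.3f}")  # > 0.50
```

The analytical approach mentioned in the abstract is designed to distinguish genuine unconscious knowledge from this kind of error-driven inflation; the sketch only shows why such a correction is needed.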
