Abstract

To what extent does language shape how we think about the world? Studies suggest that linguistic symbols expressing conceptual categories (‘apple’, ‘squirrel’) make us focus on categorical information (e.g. that you saw a squirrel) and disregard individual information (e.g. whether that squirrel had a long or short tail). Across two experiments with preverbal infants, we demonstrated that it is not language but nonverbal category knowledge that determines what information is packed into object representations. Twelve-month-olds (N = 48) participated in an electroencephalography (EEG) change-detection task involving objects undergoing a brief occlusion. When viewing objects from unfamiliar categories, infants detected both across- and within-category changes, as evidenced by their negative central wave (Nc) event-related potential. Conversely, when viewing objects from familiar categories, they did not respond to within-category changes, which indicates that nonverbal category knowledge interfered with the representation of individual surface features necessary to detect such changes. Furthermore, distinct patterns of γ and α oscillations between familiar and unfamiliar categories were evident before and during occlusion, suggesting that categorization had an influence on the format of recruited object representations. Thus, we show that nonverbal category knowledge has rapid and enduring effects on object representation and discuss their functional significance for generic knowledge acquisition in the absence of language.

Highlights

  • To what extent does language shape how we think about the world? Studies suggest that linguistic symbols expressing conceptual categories (‘apple’, ‘squirrel’) make us focus on categorical information and disregard individual information

  • Follow-up t-tests revealed that infants who viewed familiar categories noted across-category object changes, as evidenced by a more negative negative central wave (Nc) response on across-category change trials than on no-change trials, t(11) = 5.552, p < 0.001, 95% confidence interval (CI) = [3.59, 8.31], d = 1.60, but they failed to display sensitivity to within-category object changes, as shown by comparable responses on within-category change and no-change trials, t(11) = 0.597, p = 0.562, 95% CI = [−1.27, 2.21], d = 0.17

  • Compared with no-change trials, their Nc was significantly more negative on both within-category change trials, t(11) = 3.577, p = 0.004, 95% CI = [1.63, 6.83], d = 1.03, and across-category change trials, t(11) = 3.260, p = 0.008, 95% CI = [1.33, 6.84], d = 0.94 (an illustrative computation of this type of paired comparison follows this list)
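
As a point of reference for the statistics above, the following is a minimal sketch, in Python with NumPy/SciPy, of how a paired t-test, its 95% confidence interval, and Cohen's d could be computed from per-infant mean Nc amplitudes. It is not the authors' analysis code: the amplitude values are random placeholders, and the helper `cohens_d_paired` is introduced here for illustration only.

```python
# Hedged illustration only: per-infant mean Nc amplitudes (microvolts) are
# made-up placeholders, not data from the study.
import numpy as np
from scipy import stats

def cohens_d_paired(x, y):
    """Cohen's d for paired samples: mean difference / SD of the differences."""
    diff = np.asarray(x) - np.asarray(y)
    return diff.mean() / diff.std(ddof=1)

# Twelve infants (hence df = 11), one mean Nc amplitude per condition.
rng = np.random.default_rng(0)
nc_no_change = rng.normal(-10.0, 4.0, size=12)                    # no-change trials
nc_across_change = nc_no_change - rng.normal(6.0, 3.0, size=12)   # more negative Nc on change trials

# Paired t-test and effect size, analogous to the reported t(11), p, and d.
t_stat, p_value = stats.ttest_rel(nc_no_change, nc_across_change)
d = cohens_d_paired(nc_no_change, nc_across_change)

# 95% CI on the mean paired difference (t distribution, df = n - 1).
diff = nc_no_change - nc_across_change
ci = stats.t.interval(0.95, df=len(diff) - 1,
                      loc=diff.mean(), scale=stats.sem(diff))

print(f"t(11) = {t_stat:.3f}, p = {p_value:.3f}, d = {d:.2f}, 95% CI = {ci}")
```

Note that df = 11 in the reported tests implies 12 infants contributed to each comparison, consistent with the per-condition group sizes one would expect from the overall N = 48.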

Summary

Participants

An EEG task identical in structure to that of Experiment 1 was directly preceded by a short behavioural procedure: either category training or a control procedure (using the same visual stimuli and matched for exposure time, but not conducive to category learning), administered in a different experimental room from the EEG task. Because it is unknown whether infants at this age can readily learn more than two categories in a laboratory setting, we limited the number of taught categories to two. One trial lasted 15 s, for a total session duration of approximately 2 min. This training procedure, modelled on previous work in infant category learning [60], was validated in a behavioural pilot using a looking-time violation-of-expectation test administered to a separate group of participants. The control condition was designed to control for the effects of prolonged perceptual exposure to unfamiliar objects.
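
As a rough illustration of the timing constraints just described (15 s trials, a session of roughly 2 min, two taught categories), the following Python sketch lays out one possible trial schedule. The trial count (2 min / 15 s ≈ 8 trials), the simple alternation of categories, and all names are assumptions introduced here; the original training script is not described in this excerpt.

```python
# Hypothetical reconstruction of the training-block timing described above;
# not the authors' presentation script.
TRIAL_DURATION_S = 15
SESSION_DURATION_S = 2 * 60
CATEGORIES = ["category_A", "category_B"]  # placeholder names for the two taught categories

n_trials = SESSION_DURATION_S // TRIAL_DURATION_S  # ≈ 8 trials of 15 s each

schedule = [
    {"trial": i + 1,
     "category": CATEGORIES[i % len(CATEGORIES)],  # assumed alternation between the two categories
     "onset_s": i * TRIAL_DURATION_S,
     "duration_s": TRIAL_DURATION_S}
    for i in range(n_trials)
]

for trial in schedule:
    print(trial)
```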

Stimuli
Design and procedure
EEG data acquisition and preprocessing
Derivation of event-related potentials
Results and discussion
Experiment 2
Data analysis
Comparison of Experiments 1 and 2
Time course and nature of categorical biases on object representation
Time–frequency analysis
General discussion