Abstract

How does the human brain code knowledge about the world? While disciplines such as artificial intelligence represent world knowledge based on human language, neurocognitive models of knowledge have been dominated by sensory embodiment, in which knowledge is derived from sensory/motor experience and supported by high-level sensory/motor and association cortices. The neural correlates of an alternative, disembodied symbolic system had previously been difficult to establish. A recent line of studies exploring knowledge about visual properties, such as color, in visually deprived individuals converges to provide positive, compelling evidence for non-sensory, language-derived knowledge representation in the dorsal anterior temporal lobe and the extended language network, in addition to the sensory-derived representations, leading to a sketch of a dual-coding neural framework of knowledge.
