Abstract

Objective. Semantic decoding refers to the identification of semantic concepts from recordings of an individual’s brain activity. It has been previously reported in functional magnetic resonance imaging and electroencephalography. We investigate whether semantic decoding is possible with functional near-infrared spectroscopy (fNIRS). Specifically, we attempt to differentiate between the semantic categories of animals and tools. We also identify suitable mental tasks for potential brain–computer interface (BCI) applications. Approach. We explore the feasibility of a silent naming task, for the first time in fNIRS, and propose three novel intuitive mental tasks based on imagining concepts using three sensory modalities: visual, auditory, and tactile. Participants are asked to visualize an object in their minds, imagine the sounds made by the object, and imagine the feeling of touching the object. A general linear model is used to extract hemodynamic responses that are then classified via logistic regression in a univariate and multivariate manner. Main results. We successfully classify all tasks with mean accuracies of 76.2% for the silent naming task, 80.9% for the visual imagery task, 72.8% for the auditory imagery task, and 70.4% for the tactile imagery task. Furthermore, we show that consistent neural representations of semantic categories exist by applying classifiers across tasks. Significance. These findings show that semantic decoding is possible in fNIRS. The study is the first step toward the use of semantic decoding for intuitive BCI applications for communication.
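
The abstract describes a pipeline of GLM-based extraction of hemodynamic responses followed by logistic-regression classification. The following is a minimal sketch of that kind of pipeline, not the authors' code: the sampling rate, channel count, HRF parameterization, and placeholder data are illustrative assumptions, and the helper names (`canonical_hrf`, `trial_betas`) are hypothetical.

```python
# Hypothetical sketch of a GLM beta-extraction + logistic-regression pipeline
# for fNIRS trials. All parameters and data below are illustrative assumptions.
import numpy as np
from scipy.stats import gamma
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 10.0  # assumed fNIRS sampling rate in Hz


def canonical_hrf(fs=FS, duration=30.0):
    """Simplified double-gamma canonical hemodynamic response function."""
    t = np.arange(0, duration, 1.0 / fs)
    peak = gamma.pdf(t, 6)          # main response peaking around 5-6 s
    undershoot = gamma.pdf(t, 16)   # late undershoot
    hrf = peak - 0.35 * undershoot
    return hrf / hrf.max()


def trial_betas(epoch, fs=FS):
    """Fit a GLM with one task regressor (stimulus onset at t = 0 convolved
    with the HRF) plus an intercept to a single trial epoch of shape
    (n_samples, n_channels); return one beta weight per channel."""
    n_samples, _ = epoch.shape
    stick = np.zeros(n_samples)
    stick[0] = 1.0
    regressor = np.convolve(stick, canonical_hrf(fs))[:n_samples]
    X = np.column_stack([regressor, np.ones(n_samples)])
    betas, *_ = np.linalg.lstsq(X, epoch, rcond=None)
    return betas[0]  # shape: (n_channels,)


# Multivariate classification of semantic category (e.g. animal vs. tool)
# from per-channel betas stacked across trials. Placeholder random data
# stands in for real trial epochs.
rng = np.random.default_rng(0)
n_trials, n_samples, n_channels = 60, 200, 24
epochs = rng.standard_normal((n_trials, n_samples, n_channels))
labels = rng.integers(0, 2, size=n_trials)  # 0 = animal, 1 = tool

features = np.vstack([trial_betas(ep) for ep in epochs])
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")
```

A univariate analysis would instead fit and score the classifier on one channel's beta at a time, while the multivariate case above uses the full channel pattern as the feature vector.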
