Individuals with verbal communication impairments resulting from neurological conditions such as paralysis or autism face significant challenges in expressing themselves. This research proposes the design and development of a wearable system that decodes imagined speech from electroencephalogram (EEG) signals recorded during the mental process of speech generation. The system's main objective is to offer an alternative communication method for individuals who can hear and think but cannot articulate their thoughts verbally. The proposed design emphasizes user-friendliness, wearability, and comfort for seamless integration into daily life, and a minimal number of electrodes are strategically placed on the scalp to minimize invasiveness. Accurate decoding requires precise localization of the cortical areas that generate the EEG patterns associated with imagined speech. Findings from the literature are used to identify the cortical regions involved in speech processing. Because EEG has inherently limited spatial resolution, careful experiments are conducted to map scalp positions onto their corresponding cortical counterparts. Specifically, we identify the scalp location over the superior temporal gyrus (T3 in the internationally recognized 10-20 electrode placement system) by moving electrodes along a circular periphery around the nominal site in 2 cm increments. The study involves nine subjects aged 23 to 65. Each participant completes ten iterations in which they imagine six Marathi syllables. This work contributes to the development of wearable assistive technology that enables individuals who cannot speak to communicate effectively by translating their imagined speech into actionable commands, ultimately enhancing their social participation and overall well-being.
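The abstract does not specify the decoding pipeline, so the following is only a minimal sketch of how imagined-speech EEG epochs from such a protocol (six syllable classes, ten iterations per participant) might be classified. The sampling rate, channel count, epoch length, frequency bands, and choice of classifier are all assumptions introduced here for illustration, not details taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256               # sampling rate in Hz (assumed; not stated in the abstract)
N_CHANNELS = 8         # minimal electrode montage around T3 (assumed count)
N_CLASSES = 6          # six imagined Marathi syllables
TRIALS_PER_CLASS = 10  # ten iterations per participant

def bandpass(epochs, low=4.0, high=40.0, fs=FS, order=4):
    """Zero-phase band-pass filter applied along the time axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def bandpower_features(epochs, fs=FS):
    """Mean spectral power in theta/alpha/beta/gamma bands per channel."""
    bands = [(4, 8), (8, 13), (13, 30), (30, 40)]
    freqs, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., mask].mean(axis=-1))
    # resulting shape: (n_trials, n_channels * n_bands)
    return np.concatenate(feats, axis=-1)

# Synthetic stand-in for one participant's session (random data, for shape only).
rng = np.random.default_rng(0)
n_trials = N_CLASSES * TRIALS_PER_CLASS
epochs = rng.standard_normal((n_trials, N_CHANNELS, 2 * FS))  # 2 s windows (assumed)
labels = np.repeat(np.arange(N_CLASSES), TRIALS_PER_CLASS)

X = bandpower_features(bandpass(epochs))
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"6-class cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real recordings, the random array would be replaced by epochs cut around each imagined-syllable cue; band-power features and an SVM are used here only as a common baseline for small EEG datasets, not as the method reported in the paper.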