Abstract
A novel de-noising method based on the wavelet transform is presented. It exploits the good localization of the wavelet transform in both the time and frequency domains to extract visual evoked potentials (VEPs) from the EEG background noise in a single training sample (i.e., a single trial), which makes it possible to study how the response changes from trial to trial. This information is probably related to different functions, appearances, and pathologies of the brain. The method can also remove artifacts that do not overlap with the EP in time or frequency, a result that a traditional Fourier filter can hardly achieve. It differs from other wavelet de-noising methods in the criterion used to choose the wavelet coefficients: its greatest virtue is that it preserves the differences among single trials and uses the high time-frequency resolution of the wavelet transform to suppress interference as much as possible within the time window in which the EP appears. The experimental results show that the method is not restricted by the signal-to-noise ratio between the evoked potential and the background EEG; it can recognize instantaneous events even at low signal-to-noise ratios, and it more easily recognizes the trials that evoke an evident response. In addition, compared with the averaging methodology, it dramatically reduces the number of recorded trials needed, thus avoiding the effect of behavioral changes during the recording process. Because the method both respects the differences among single trials and accomplishes the extraction of visual evoked potentials from a single trial, applying it to a brain-computer interface system based on evoked responses could greatly improve system speed and accuracy.
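To make the idea of coefficient selection by time-frequency localization concrete, the following is a minimal sketch of one possible criterion: decompose a single trial with a discrete wavelet transform and zero the coefficients whose approximate time support falls outside the latency window where the EP is expected. The abstract does not specify the paper's actual selection rule, wavelet family, or window, so the `db4` wavelet, decomposition level, and `ep_window` values below are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import pywt  # PyWavelets


def extract_single_trial_vep(trial, fs, ep_window=(0.05, 0.30),
                             wavelet="db4", level=5):
    """Suppress EEG background noise in one trial by zeroing wavelet
    coefficients whose approximate time support lies outside the latency
    window (in seconds) where the evoked potential is expected."""
    coeffs = pywt.wavedec(trial, wavelet, level=level)
    n = len(trial)
    kept = []
    for band in coeffs:
        # Map each coefficient in this sub-band back to an approximate
        # time position, then keep only those inside the EP window.
        t = np.linspace(0.0, n / fs, num=len(band), endpoint=False)
        mask = (t >= ep_window[0]) & (t <= ep_window[1])
        kept.append(band * mask)
    # Reconstruct the de-noised trial; trim any padding from the transform.
    return pywt.waverec(kept, wavelet)[:n]


# Usage: one 1-second trial sampled at 250 Hz (synthetic noise stands in
# for a recorded EEG sweep here).
fs = 250
trial = np.random.randn(fs)
vep_estimate = extract_single_trial_vep(trial, fs)
```

Unlike threshold-based wavelet de-noising, this kind of time-window criterion discards interference that shares the EP's frequency band but occurs at a different time, which is the property the abstract attributes to the method.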