Abstract

Brain-computer interfaces (BCIs) are promising for interacting with machines through electroencephalogram (EEG) signals. EEGNet, a compact end-to-end neural network model for generalized BCIs, has been implemented in hardware to provide near-sensor intelligence, but not efficiently enough. To make EEGNet usable in low-power wearable devices for long-term use, this paper proposes an efficient EEGNet inference accelerator. First, the EEGNet model is compressed by embedded channel selection, normalization merging, and product quantization. A customized accelerator based on the compressed model is then designed. The multilayer convolutions are computed by reusing multiply-accumulators and processing elements (PEs) to minimize the area of the logic circuits, and the weights and intermediate results are quantized to minimize memory size. The PEs are clock-gated to save power. Experimental results on an FPGA with three datasets show the good generalization ability of the proposed design across three BCI paradigms; it consumes only 3.31% of the area and 1.35% of the power of a one-to-one parallel design. Speedup factors of 1.4, 3.5, and 3.7 are achieved by embedded channel selection with negligible loss of accuracy (-0.80%). The presented accelerator is also synthesized in a 65nm CMOS low-power (LP) process, where it consumes 0.23M gates, 24.4ms/inference, and 0.267mJ/inference; on the BCIC-IV-2a dataset, this is 87.22% more efficient in area and 20.77% more efficient in energy than an implementation of EEGNet on a RISC-V MCU realized in a 40nm CMOS LP process.
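Normalization merging, one of the compression steps named above, conventionally means folding batch-normalization parameters into the preceding convolution's weights and bias at inference time, so the normalization layer disappears from the deployed model. The sketch below illustrates that standard folding in NumPy under common assumptions (per-output-channel batch norm with learned scale gamma and shift beta); the function name and tensor shapes are illustrative, not taken from the paper:

```python
import numpy as np

def fold_batchnorm(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold batch-norm parameters into the preceding conv's weights/bias.

    w: conv weights of shape (out_channels, ...); b: bias (out_channels,).
    After folding, conv(x, w_f) + b_f == batchnorm(conv(x, w) + b).
    """
    scale = gamma / np.sqrt(var + eps)                    # per-channel scale
    w_f = w * scale.reshape(-1, *([1] * (w.ndim - 1)))    # scale each output channel
    b_f = (b - mean) * scale + beta                       # absorb shift into the bias
    return w_f, b_f
```

Because the folded layer needs no separate mean/variance buffers or normalization arithmetic, it reduces both memory traffic and logic, which is why it pairs naturally with the quantization steps described in the abstract.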
