Abstract

Word embedding is a technique for representing words as vectors in a way that captures their semantic and syntactic relationships. Word2vec, one of the most popular word embedding techniques, has a long processing time because of the large size of the training data. We evaluate the performance of a power-efficient FPGA-based accelerator designed using OpenCL. We achieved up to an 18.7-fold speed-up compared to a single-core CPU implementation with the same accuracy. The proposed accelerator consumes less than 83 W of power and is more power-efficient than many top-end CPU- and GPU-based accelerators.
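For context, the computation that dominates Word2vec training time (and that an accelerator would target) is the repeated skip-gram negative-sampling update applied to every word/context pair in the corpus. The sketch below is a minimal plain-NumPy illustration of that inner loop, not the paper's OpenCL/FPGA implementation; the function and parameter names are assumptions chosen for clarity.

```python
import numpy as np

def sgns_update(W_in, W_out, center, context, neg_samples, lr=0.025):
    """One skip-gram negative-sampling update, the hot inner loop of
    Word2vec training. W_in and W_out are the input/output embedding
    matrices, 'center' and 'context' are word indices, and 'neg_samples'
    is a list of randomly drawn negative word indices. (Illustrative
    sketch only; not taken from the paper.)"""
    v = W_in[center]                 # embedding of the center word
    grad_v = np.zeros_like(v)
    # Positive pair (label 1) followed by the negative samples (label 0).
    for idx, label in [(context, 1.0)] + [(n, 0.0) for n in neg_samples]:
        u = W_out[idx]
        score = 1.0 / (1.0 + np.exp(-np.dot(v, u)))   # sigmoid of dot product
        g = lr * (label - score)                      # gradient scale
        grad_v += g * u
        W_out[idx] += g * v
    W_in[center] += grad_v
```

Because this update involves only short dot products and vector additions repeated billions of times over a large corpus, it is memory- and throughput-bound, which is why a pipelined FPGA design can deliver large speed-ups at low power.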
