Abstract

Neural network models, celebrated for their scalability and computational power, have demonstrated remarkable performance across fields such as vision, language, and multimodality. Their rapid advancement, driven by the growth of Internet technology and the increasing demand for intelligent edge devices, introduces new challenges, including large model parameter counts and growing storage pressure. In this context, Field-Programmable Gate Arrays (FPGAs) have emerged as a preferred platform for accelerating neural network models, thanks to their high performance, energy efficiency, flexibility, and scalability. Building FPGA-based neural network systems requires bridging significant differences in objectives, methods, and design spaces between model design and hardware design. This review adopts a comprehensive analytical framework to explore implementation strategies across multiple technological dimensions, encompassing optimizations at the algorithmic and hardware levels as well as compiler optimization techniques. It focuses on methods for algorithm-hardware co-optimization, identifies challenges in the collaborative design process, and proposes corresponding implementation strategies and key steps. For each technological dimension, the article provides in-depth technical analysis and discussion, aiming to offer valuable insights for research on optimizing and accelerating neural network models in edge computing environments.
