Abstract
Mat-Core is a research processor that aims to exploit the increasing number of transistors per IC to improve the performance of a wide range of applications. It extends a general-purpose scalar processor with a matrix unit for processing vector/matrix data. The matrix unit is decoupled into two components that communicate through data queues, address generation and data computation, in order to hide memory latency. This paper investigates the scalability of the Mat-Core architecture with different numbers of parallel lanes (one, four, and eight) on a set of linear algebra kernels: scalar-vector multiplication, SAXPY, Givens rotation, rank-1 update, vector-matrix multiplication, and matrix-matrix multiplication. A cycle-accurate model of the Mat-Core processor is implemented in SystemC (a system-level modeling language). Four versions of the Mat-Core processor are implemented and evaluated to show its scalability: a single lane with 8-element vector registers, four lanes with 4×4 matrix registers, four lanes with 8×4 matrix registers, and eight lanes with 8×8 matrix registers. The first version (single lane with 8-element vector registers) exploits only the scalar and vector ISA, whereas the other versions can exploit all three levels of the Mat-Core ISA (scalar/vector/matrix). Our results show that increasing the number of parallel lanes from one to four and then to eight speeds up the execution of the six kernels by factors of 3.6×–4.8× and 7.94×–10.6×, respectively, which demonstrates the scalability of the Mat-Core architecture. Moreover, the maximum performance of the Mat-Core processor on matrix-matrix multiplication reaches 90% of the ideal value.
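For readers unfamiliar with the benchmark kernels named above, the following is a minimal C sketch of the conventional scalar definitions of two of them, SAXPY (a vector-level operation) and rank-1 update (a matrix-level operation). The function names and signatures are illustrative only; they are not taken from the Mat-Core ISA or the paper's benchmark code.

```c
#include <stddef.h>

/* SAXPY: y <- a*x + y  (vector-level kernel; illustrative reference code) */
void saxpy(size_t n, float a, const float *x, float *y)
{
    for (size_t i = 0; i < n; i++)
        y[i] += a * x[i];
}

/* Rank-1 update: A <- A + alpha * x * y^T, with A stored row-major as m*n floats
 * (matrix-level kernel; illustrative reference code) */
void rank1_update(size_t m, size_t n, float alpha,
                  const float *x, const float *y, float *A)
{
    for (size_t i = 0; i < m; i++)
        for (size_t j = 0; j < n; j++)
            A[i * n + j] += alpha * x[i] * y[j];
}
```

On Mat-Core, kernels of this kind would be mapped onto the vector and matrix registers described in the abstract, with the address-generation component streaming operands into the data queues while the parallel lanes perform the arithmetic.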