Abstract
The unpredictable nature of irregular memory accesses in mixed-access workloads, such as deep learning applications, poses significant communication challenges. In a typical multi-GPU node, a large number of simultaneous memory requests can consume almost 80% of the processing time on memory mapping. This motivates a characterization of mixed regular and irregular memory accesses so that memory divergence can be simplified and performance improved. In this paper, using the large deviations principle, it is shown that mixed regular and irregular memory accesses can be viewed as a combination of continuous and discrete functions. This viewpoint is shown to yield better performance by characterizing memory divergence in a multi-GPU node through the sub-additivity property. Further, a detection test procedure based on a quenched large deviations model is proposed; it generates threshold values for optimizing memory mapping in data-intensive applications, thereby improving performance.
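The abstract leans on two standard probabilistic tools, the large deviations principle and sub-additivity, without spelling out how they connect. A minimal sketch of the standard argument follows; the request costs X_i, the sums S_n, the level x, and the rate function I are introduced here purely for illustration and are not the paper's notation:

```latex
% Let X_1, X_2, ... be i.i.d. per-request costs (an assumption made
% here for the sketch) and S_n = X_1 + ... + X_n. Write
%   p_n(x) = Pr[S_n >= n x]
% for the probability that n requests deviate above level x on average.
%
% Independence gives super-multiplicativity: if the first m requests
% and the next n requests each deviate, so does the whole block, so
%   p_{m+n}(x) >= p_m(x) p_n(x),
% i.e. a_n = log p_n(x) is super-additive. Fekete's lemma then gives
% existence of the exponential decay rate (the rate function I):
\[
  I(x) \;=\; -\lim_{n\to\infty} \frac{1}{n}\log p_n(x)
        \;=\; -\sup_{n\ge 1} \frac{1}{n}\log p_n(x),
  \qquad
  p_n(x) \approx e^{-n\,I(x)}.
\]
```

Under these assumptions, the sub-additivity property is exactly what guarantees that a well-defined decay rate exists for the tail of aggregated memory-request behavior, which is the quantity the characterization can then exploit.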
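On the detection side, the abstract proposes thresholds derived from a quenched large deviations model but gives no algorithmic detail. The following Python sketch shows one plausible shape for such a test, estimating an empirical decay rate from sliding windows of observed request latencies and flagging a stream as irregular when the rate falls below a threshold; the window size, level x, threshold value, and function names are all illustrative assumptions, not the paper's procedure.

```python
import math
from collections import deque

def empirical_rate(latencies, x, window):
    """Estimate the large-deviations decay rate
    -(1/n) log P[S_n / n >= x] from sliding windows of
    observed memory-request latencies (illustrative estimator)."""
    exceed = 0   # windows whose mean latency deviates above x
    total = 0    # windows observed
    buf = deque(maxlen=window)
    for t in latencies:
        buf.append(t)
        if len(buf) == window:
            total += 1
            if sum(buf) / window >= x:
                exceed += 1
    if total == 0 or exceed == 0:
        return float("inf")  # no deviations seen: maximally regular
    return -math.log(exceed / total) / window

def is_irregular(latencies, x, window=32, threshold=0.05):
    """Flag an access stream as irregular when deviations above x
    decay too slowly, i.e. the estimated rate is below the threshold."""
    return empirical_rate(latencies, x, window) < threshold
```

Under this sketch, a stream of coalesced, regular accesses produces few exceedances and a large estimated rate, while divergent access patterns keep exceedances frequent and the rate near zero, tripping the test; that trip point is where remapping the memory layout would plausibly pay off.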