Abstract

The Message Passing Interface (MPI) was developed over eighteen years ago and continues to be the preferred programming model for scientific computing. Contributing to that success was a combination of forward-looking features, precise definition, and judgment based on the experience of developers, vendors, and users. Today, MPI continues to adapt to the changing needs of parallel programming, with MPI-3 introducing enhancements for collective and one-sided communication, multi-threaded programming, and support for performance tools. However, MPI faces many challenges as the nature of parallel computing changes more radically than at any time in its history. This talk will touch on some of the less obvious but important reasons for MPI's success, discuss some of the challenges that MPI faces, and make suggestions for future directions in MPI and parallel programming language research.
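
As illustration only (not drawn from the talk), a minimal sketch of one of the MPI-3 collective-communication enhancements the abstract mentions: a nonblocking collective such as MPI_Iallreduce, which starts the reduction and lets the caller overlap independent work before completing it with MPI_Wait.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double local  = (double)rank;  /* each rank contributes its rank id */
    double global = 0.0;
    MPI_Request req;

    /* Start the reduction without blocking (introduced in MPI-3). */
    MPI_Iallreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM,
                   MPI_COMM_WORLD, &req);

    /* ... independent computation could overlap with the collective here ... */

    MPI_Wait(&req, MPI_STATUS_IGNORE);  /* complete the collective */

    if (rank == 0)
        printf("sum of ranks = %g\n", global);

    MPI_Finalize();
    return 0;
}
```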
