MPI
Want to write programs for a cluster? Here is your chance. MPI implementer and cluster workhorse Jeff Squyres guides you through the nuances of writing MPI programs with a set of outstanding tutorials. Unlike other MPI tutorials, Jeff addresses cluster issues and optimizations. Just dive in; you don't even need a cluster to get started.
- Written by Jeff Squyres
When blades of grass grow in parallel, do they know about each other? Do they coordinate? Perhaps they have some kind of collective intelligence? Do they use MPI?
In previous editions of this column, we've talked about the six basic functions of MPI, how MPI_INIT and MPI_FINALIZE actually work, and discussed in agonizing detail the differences between MPI ranks, MPI processes, and processors. Armed with this knowledge, you can write large, sophisticated parallel programs. So what's next?
Collective communication is the next logical step: MPI's native ability to involve a group of MPI processes in a single communication, possibly involving some intermediate computation.
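To give a flavor of what that looks like, here is a minimal sketch using two standard collective calls, MPI_Bcast and MPI_Reduce (the buffer contents, root rank, and reduction operation are just illustrative choices, not anything specific to this column's examples):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size, value, sum;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Rank 0 picks a value; MPI_Bcast delivers it to every process. */
        value = (rank == 0) ? 42 : 0;
        MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

        /* Every process contributes its rank; MPI_Reduce sums them on rank 0. */
        MPI_Reduce(&rank, &sum, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("broadcast value %d, sum of ranks %d\n", value, sum);

        MPI_Finalize();
        return 0;
    }

Every process in the communicator makes the same call; the MPI implementation is free to perform the data movement and the intermediate arithmetic however it sees fit.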
- Written by Jeff Squyres
Behind the scenes at MPI studios
In the previous two installments, we covered the basics and fundamentals: what MPI is, some simple MPI example programs, and how to compile and run them. For this column, we will detail what happens in MPI_INIT in a simple MPI application (the "ping-pong" example program in Listing 1).
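The original Listing 1 is not reproduced here, but a typical ping-pong exchange looks something like this sketch: two MPI processes bouncing a single integer back and forth with MPI_Send and MPI_Recv (the round-trip count, tag, and message contents are arbitrary):

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, msg = 0, i;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Assumes exactly two processes: rank 0 "pings", rank 1 "pongs". */
        for (i = 0; i < 10; ++i) {
            if (rank == 0) {
                MPI_Send(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(&msg, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            } else if (rank == 1) {
                MPI_Recv(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                MPI_Send(&msg, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
            }
        }

        if (rank == 0)
            printf("completed %d round trips\n", i);

        MPI_Finalize();
        return 0;
    }

Before any of that sending and receiving can happen, though, MPI_INIT has to set up the entire run-time environment behind the scenes, which is the subject of this installment.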
- Written by Jeff Squyres
Proper process processing and placement - say that three times
In my previous column, I covered the basics and fundamentals: what MPI is, a simple MPI example program, and how to compile and run the program. In this installment, let's dive into a common terminology misconception: processes vs. processors - they're not necessarily related!
In this context, a processor typically refers to a CPU. Typical cluster configurations use uniprocessor or small Symmetric Multi Processor (SMP) nodes (e.g., 2-4 CPUs each). Hence, "processor" has a physical - and finite - meaning. Dual-core CPUs can be treated like small SMP nodes.
If you recall, I previously said that MPI is described mostly in terms of "MPI processes," where the exact definition of "MPI process" is up to the implementation (it is usually a process or a thread). An MPI application is composed of one or more MPI processes. It is up to the MPI implementation to map MPI processes onto processors.
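One easy way to see that mapping from inside a program is to have each MPI process report its rank and the node it landed on. This is a small sketch using MPI_Get_processor_name; the placement you actually observe depends on how your MPI implementation and scheduler were told to lay out processes:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size, len;
        char name[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Report where this MPI process is running. Several ranks may
           well share the same node (e.g., on an SMP or dual-core node). */
        MPI_Get_processor_name(name, &len);
        printf("MPI process %d of %d running on %s\n", rank, size, name);

        MPI_Finalize();
        return 0;
    }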
- Written by Jeff Squyres
What is MPI, really? Who should use it, and how? Learn the fundamentals from an open-source MPI implementor.
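As a small taste of those fundamentals, here is the canonical MPI "hello world" - a minimal sketch built only from MPI_Init, MPI_Comm_rank, MPI_Comm_size, and MPI_Finalize:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        int rank, size;

        MPI_Init(&argc, &argv);                 /* start up the MPI run-time */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which MPI process am I?   */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many are there?       */

        printf("Hello from MPI process %d of %d\n", rank, size);

        MPI_Finalize();                         /* shut the run-time down    */
        return 0;
    }

Most MPI implementations provide a compiler wrapper (such as mpicc) and a launcher (such as mpirun) to build and start a program like this, though the exact commands vary by implementation.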