
First steps with MPI.NET programming


These days I’m exploring MPI programming, using the MPI.NET wrapper over Microsoft MPI, which can be used on Windows HPC Server 2008.

MPI stands for Message Passing Interface, an API that supports the writing of parallel programs. An MPI application runs as many instances, called ranks, and each instance can send messages to and receive messages from the others. The API can be consumed from languages like C or Fortran. MPI.NET is a wrapper that eases the writing of MPI programs in .NET.

You don’t need a cluster to run an MPI executable. Each program can be tested locally, launching many ranks as processes on your local machine.

Some days ago, I wrote some sample code to test my understanding of MPI.NET. You can download the source from my SkyDrive, in MpiNetFirstExamples.zip.

If you want to try another way, months ago I posted about a different .NET implementation:

MPI Message Passing Interface in .NET

MPI

MPI (Message Passing Interface) is supported by Windows HPC. There is a Microsoft implementation:

Microsoft MPI (Windows)

that can be invoked from C++.

There is a .NET implementation over Microsoft MPI:

MPI.NET: High-Performance C# Library for Message Passing

It has source code and examples.

To run these examples, I installed the HPC Pack SDK, which I downloaded from:

HPC Pack 2008 SDK download

and then I installed the MPI.NET software.

(I installed MPI.NET SDK.msi, but I expanded MPI.NET-1.0.0.zip too: it has better examples, with VS solutions.)

Installing the HPC Pack 2008 SDK adds new programs to your machine, and the MPI.NET setup adds its own entries as well.

If you expand the additional MPI.NET-1.0.0.zip, you get a folder with more examples and documentation.

More about MPI in general:

MPI 2.0 Report
MPI Tutorials
Microsoft Message Passing Interface – Wikipedia, the free encyclopedia
Pure Mpi.NET

Hello World

As usual, a “Hello, World” MPI application is the first app to try; my solution is the MpiNetHelloWorld project.

The Program.cs source is simple:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace MpiNetHelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            using (new MPI.Environment(ref args))
            {
                Console.WriteLine("I'm {0} of {1}", MPI.Communicator.world.Rank, MPI.Communicator.world.Size);
            }
        }
    }
}

Note the use of ref args in the initialization of MPI.Environment. MPI receives additional dedicated arguments, so it has to process them and remove them from the rest of the arguments; after the constructor runs, args contains only your application’s own arguments.
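A quick way to see this is to print whatever is left in args after the environment is created. In a small sketch like the following (MpiNetArgsDemo and the --verbose flag are just made-up names for illustration), only the application’s own arguments should show up:

using System;

namespace MpiNetArgsDemo
{
    class Program
    {
        static void Main(string[] args)
        {
            // e.g. mpiexec -n 2 MpiNetArgsDemo.exe --verbose
            using (new MPI.Environment(ref args))
            {
                // Any MPI-specific arguments have already been consumed;
                // here we expect to see only "--verbose".
                foreach (string arg in args)
                    Console.WriteLine("Application argument: {0}", arg);
            }
        }
    }
}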

You can run the executable alone; with a single process you get just one line, reporting rank 0 of 1.

Not very impressive…. ;-)

You can launch it from the command line using mpiexec:

mpiexec -n 8 MpiNetHelloWorld.exe

then each of the 8 ranks prints its own line.
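A run might look something like this (the exact ordering of the lines varies from run to run, since the ranks write independently):

I'm 0 of 8
I'm 1 of 8
I'm 2 of 8
I'm 3 of 8
I'm 4 of 8
I'm 5 of 8
I'm 6 of 8
I'm 7 of 8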

There are 8 ranks (instances) running on the same machine. If you have a cluster with MPI support (such as Windows HPC Server 2008), you could run the program on all the nodes of the cluster.

Ringing the nodes

In the previous example, no communication between ranks occurred. A classic exercise is to send messages around a ring: from rank 0 to 1, from 1 to 2, and so on, with the last rank sending back to rank 0.

The Program.cs code of my solution is:

using System;
using MPI;

class Program
{
    static void Main(string[] args)
    {
        using (MPI.Environment environment = new MPI.Environment(ref args))
        {
            Intracommunicator comm = MPI.Communicator.world;

            if (comm.Size < 2)
            {
                Console.WriteLine("At least two processes are needed");
                return;
            }

            Console.WriteLine("I'm {0} of {1}", MPI.Communicator.world.Rank, MPI.Communicator.world.Size);

            if (comm.Rank == 0) // It's the root
            {
                // Start the ring: send to rank 1, then wait for the message
                // coming back from the last rank
                string sendmessage = string.Format("Hello from {0} to {1}", comm.Rank, comm.Rank + 1);
                comm.Send(sendmessage, comm.Rank + 1, 0);

                string recmessage;
                comm.Receive<string>(comm.Size - 1, 0, out recmessage);
                Console.WriteLine("Received: {0}", recmessage);
            }
            else
            {
                // Receive from the previous rank, then forward to the next one
                // (wrapping around to rank 0 at the end of the ring)
                string recmessage;
                comm.Receive<string>(comm.Rank - 1, 0, out recmessage);
                Console.WriteLine("Received: {0}", recmessage);

                string sendmessage = string.Format("Hello from {0} to {1}", comm.Rank, (comm.Rank + 1) % comm.Size);
                comm.Send(sendmessage, (comm.Rank + 1) % comm.Size, 0);
            }
        }
    }
}
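Running the ring with four ranks, say with mpiexec -n 4 MpiNetRing.exe (the executable name here is just my assumption about how the project is called), should produce output along these lines, interleaved in some order:

I'm 0 of 4
I'm 1 of 4
I'm 2 of 4
I'm 3 of 4
Received: Hello from 0 to 1
Received: Hello from 1 to 2
Received: Hello from 2 to 3
Received: Hello from 3 to 0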

Scattering messages

There is another example (the MpiNetScatter solution), where an array of integers is scattered from rank 0 to all the ranks:

using System;
using MPI;

class Program
{
    static void Main(string[] args)
    {
        using (MPI.Environment environment = new MPI.Environment(ref args))
        {
            Intracommunicator comm = Communicator.world;

            if (comm.Rank == 0)
            {
                // The root builds an array with one element per rank
                // and scatters it: each rank gets one value
                int[] numbers = new int[comm.Size];
                for (int k = 0; k < numbers.Length; k++)
                    numbers[k] = k * k;

                int r = comm.Scatter(numbers);
                Console.WriteLine("Received {0} at {1}", r, comm.Rank);
            }
            else
            {
                // The other ranks only indicate the root (rank 0) of the scatter
                int r = comm.Scatter<int>(0);
                Console.WriteLine("Received {0} at {1}", r, comm.Rank);
            }
        }
    }
}
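With four ranks (for example mpiexec -n 4 MpiNetScatter.exe, assuming the executable keeps the solution name), each rank should receive the square of its own rank number, printing something like this in some order:

Received 0 at 0
Received 1 at 1
Received 4 at 2
Received 9 at 3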

Threads and MPI

We can improve the ring example using a feature from MPI-2 that is supported by the Microsoft implementation: sending and receiving messages from multiple threads. The solution is MpiNetMultiThreadRing. The code:

using System;
using System.Threading;
using MPI;

class Program
{
    static void Main(string[] args)
    {
        using (MPI.Environment environment = new MPI.Environment(ref args))
        {
            Intracommunicator comm = MPI.Communicator.world;

            if (comm.Size < 2)
            {
                Console.WriteLine("At least two processes are needed");
                return;
            }

            // Two workers per process, each running the ring logic in its own thread
            MultiComm multicomm = new MultiComm(MPI.Communicator.world);
            Thread thread = new Thread(new ThreadStart(multicomm.Run));
            thread.Start();

            MultiComm multicomm2 = new MultiComm(MPI.Communicator.world);
            Thread thread2 = new Thread(new ThreadStart(multicomm2.Run));
            thread2.Start();

            thread.Join();
            thread2.Join();
        }
    }
}

I wrote a helper class, MultiComm, that has methods to send and receive messages. It uses a lock: the MPI implementation doesn’t support calling MPI from more than one thread simultaneously, so I have to synchronize the methods that access MPI from different threads. It’s a shame, but that is what is supported.
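The MultiComm class itself is not listed in this post; a minimal sketch of the synchronization idea (the field and method names below are my own, not the actual class from the solution) could look like this:

using System;
using MPI;

class MultiComm
{
    // A single lock shared by every instance in the process:
    // the MPI implementation only tolerates one thread inside MPI at a time,
    // so every Send/Receive goes through this lock.
    private static readonly object mpiLock = new object();
    private readonly Intracommunicator comm;

    public MultiComm(Intracommunicator comm)
    {
        this.comm = comm;
    }

    public void Run()
    {
        // The ring logic goes here, built on the synchronized helpers below.
    }

    private void SynchronizedSend(string message, int destination, int tag)
    {
        lock (mpiLock)
            comm.Send(message, destination, tag);
    }

    private string SynchronizedReceive(int source, int tag)
    {
        // Note: a blocking Receive holds the lock until a message arrives,
        // which keeps the other thread out of MPI in the meantime.
        string message;
        lock (mpiLock)
            comm.Receive<string>(source, tag, out message);
        return message;
    }
}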

Conclusion

MPI implies a new way of thinking about applications. There is no easy path to MPIfy an algorithm or application. I should play with asynchronous message passing: in the above examples, when an instance sends a message, the other party should already be listening to receive it. Despite its idiosyncrasies, MPI is an interesting field to explore, with a wide community and interesting applications.
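As a hint of what that asynchronous style could look like, MPI.NET exposes non-blocking operations that return request objects to wait on; a rough sketch, assuming a communicator comm and a next rank computed as in the ring example, might be:

// Sketch of non-blocking messaging with MPI.NET (ImmediateSend / ImmediateReceive):
// post the send and the receive, do other work, then wait for completion.
Request sendRequest = comm.ImmediateSend(string.Format("Hello from {0}", comm.Rank), next, 0);
ReceiveRequest receiveRequest = comm.ImmediateReceive<string>(Communicator.anySource, 0);

// ... the rank could do useful work here instead of blocking ...

sendRequest.Wait();
receiveRequest.Wait();
Console.WriteLine("Received: {0}", receiveRequest.GetValue());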

Angel “Java” Lopez
http://www.ajlopez.com/en
http://twitter.com/ajlopez


