
Open MPI program not working when distributing processes among multiple hosts

My test program works fine when I run multiple processes on a single machine.

$ ./mpirun -np 2 ./mpi-test
Hi I'm A:0
Hi I'm A:1
A:1 sending 11...
A:1 sent 11
A:0 received 11 from 1
all workers checked in!

When I run the same program across multiple hosts, a process is spawned on each host, but MPI_Send never returns.

$ ./mpirun -np 2 -host A,B ./mpi-test
Hi I'm A:0
Hi I'm B:1
B:1 sending 11...

I've tried a couple of other sample MPI programs I found, and I ran into the same problem. Any idea what is going wrong?

EDIT: this also runs on a remote machine if all the processes are spawned on that machine.

Code:

#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv)
{
    MPI::Init();
    int rank = MPI::COMM_WORLD.Get_rank();
    int size = MPI::COMM_WORLD.Get_size();
    char name[256];
    int len;
    MPI::Get_processor_name(name, len);

    printf("Hi I'm %s:%d\n", name, rank);

    if (rank == 0) {
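        // rank 0 collects one check-in message from each of the other ranks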
        while (size > 1) {
            int val;
            MPI::Status status;
            MPI::COMM_WORLD.Recv(&val, 1, MPI::INT, MPI::ANY_SOURCE, MPI::ANY_TAG, status);
            int source = status.Get_source();
            printf("%s:0 received %d from %d\n", name, val, source);
            size--;
        }
        printf("all workers checked in!\n");
    }
    else {
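        // every other rank sends a single value to rank 0 and exits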
        int val = rank + 10;
        printf("%s:%d sending %d...\n", name, rank, val);
        MPI::COMM_WORLD.Send(&val, 1, MPI::INT, 0, 0);
        printf("%s:%d sent %d\n", name, rank, val);
    }
    MPI::Finalize();

    return 0;
}
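
For reference, the program is built with Open MPI's C++ wrapper compiler (the source file name below is just illustrative):

$ mpic++ mpi-test.cpp -o mpi-test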

EDIT: ompi_info

$ ./mpirun --bynode -host A,B --tag-output ompi_info -v ompi full --parsable
[1,0]<stdout>:package:Open MPI user@A Distribution
[1,0]<stdout>:ompi:version:full:1.4.3
[1,0]<stdout>:ompi:version:svn:r23834
[1,0]<stdout>:ompi:version:release_date:Oct 05, 2010
[1,0]<stdout>:orte:version:full:1.4.3
[1,0]<stdout>:orte:version:svn:r23834
[1,0]<stdout>:orte:version:release_date:Oct 05, 2010
[1,0]<stdout>:opal:version:full:1.4.3
[1,0]<stdout>:opal:version:svn:r23834
[1,0]<stdout>:opal:version:release_date:Oct 05, 2010
[1,0]<stdout>:ident:1.4.3
[1,1]<stdout>:package:Open MPI user@B Distribution
[1,1]<stdout>:ompi:version:full:1.4.3
[1,1]<stdout>:ompi:version:svn:r23834
[1,1]<stdout>:ompi:version:release_date:Oct 05, 2010
[1,1]<stdout>:orte:version:full:1.4.3
[1,1]<stdout>:orte:version:svn:r23834
[1,1]<stdout>:orte:version:release_date:Oct 05, 2010
[1,1]<stdout>:opal:version:full:1.4.3
[1,1]<stdout>:opal:version:svn:r23834
[1,1]<stdout>:opal:version:release_date:Oct 05, 2010
[1,1]<stdout>:ident:1.4.3


I ended up upgrading A to Open MPI 1.5.3 and installing 1.5.3 on a new node, C. I'm not sure whether it was the upgrade or an issue with B, but everything is working now.

For reference:

  • original setup: node A (Arch Linux, Open MPI 1.4.3), node B (Ubuntu, Open MPI 1.4.3)
  • working setup: node A (Arch Linux, Open MPI 1.5.3), node C (Arch Linux, Open MPI 1.5.3)


The usual reason for this is that something is not set up properly on the remote host: it could be a login/network problem, or the MPI libraries/executables or the program itself may not be found on the remote host.

What happens if you try

mpirun -np 2 -host A,B  hostname

?
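
If hostname works, a few other quick checks from A are worth trying (the host name and path below are just examples for your setup): that you can ssh to B without a password, that B finds the same mpirun on its PATH, and that the test binary exists at the same path on B:

ssh B hostname
ssh B which mpirun
ssh B ls /path/to/mpi-test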
