Hello,
The only times I have seen things like this were on misconfigured
clusters. MPI_Init would fail due to "missing" libraries on remote nodes
or improperly set environments. The lack of information makes it
impossible to go any further than that, but I am 99.99% certain this is
NOT a LAM issue. You should try to troubleshoot this with the cluster
maintainer, who will be able to provide far more interactive support, as
well as access to the system logs.
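
If you want to narrow down whether the 64-bit processes even reach
MPI_Init, one quick sanity check (a minimal sketch, nothing LAM-specific)
is to print to stderr before MPI_Init and look at its return code. With
the default error handler the library usually aborts inside MPI_Init
rather than returning, so the early print is the more telling part: if it
never appears, the process died before main(), e.g. because the runtime
linker could not resolve a shared library on that node.

#include <stdio.h>
#include "mpi.h"

int main(int argc, char *argv[])
{
    int rc;

    /* If this line never shows up, the process died before reaching
       main() -- e.g. a shared library could not be found on that node. */
    fprintf(stderr, "about to call MPI_Init\n");

    rc = MPI_Init(&argc, &argv);
    if (rc != MPI_SUCCESS) {
        /* With the default error handler this branch is usually never
           reached (the library aborts first), but it costs nothing. */
        fprintf(stderr, "MPI_Init returned %d\n", rc);
        return 1;
    }

    MPI_Finalize();
    return 0;
}

You could also run ldd on ./hi64 on one of the 64-bit nodes to confirm
that the MPI libraries resolve to the LAM 7.1.1 installation the binary
was compiled against.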
On Tue, 28 Jun 2005, Dr. Stephan Nickell wrote:
> Thanks gentlemen,
>
> Right, let's change the code a little bit:
>
> cat hi.c
> #include <stdio.h>
> #include "mpi.h"
> int main(int argc, char *argv[])
> {
>     int numprocs, myid;
>     MPI_Status status;
>     MPI_Init(&argc, &argv);
>     MPI_Comm_size(MPI_COMM_WORLD, &numprocs);
>     MPI_Comm_rank(MPI_COMM_WORLD, &myid);
>     printf("hiho! from processor %d of %d.\n", myid, numprocs);
>     MPI_Finalize();
> }
>
> It works on 32-bit CPUs running LAM 6.5.6/MPI 2 C++/ROMIO - University of Notre
> Dame
>
> like this:
>
>>> mpirun -np 4 ./hi32
> hiho! from processor 1 of 4.
> hiho! from processor 0 of 4.
> hiho! from processor 2 of 4.
> hiho! from processor 3 of 4.
>
> but on 64-bit CPUs running LAM 7.1.1/MPI 2 C++/ROMIO - Indiana University, I get:
>
>>> mpirun -np 4 ./hi64
> -----------------------------------------------------------------------------
> It seems that [at least] one of the processes that was started with
> mpirun did not invoke MPI_INIT before quitting (it is possible that
> more than one process did not invoke MPI_INIT -- mpirun was only
> notified of the first one, which was on node n0).
>
> mpirun can *only* be used with MPI programs (i.e., programs that
> invoke MPI_INIT and MPI_FINALIZE). You can use the "lamexec" program
> to run non-MPI programs over the lambooted nodes.
> -----------------------------------------------------------------------------
>
> Further suggestions?
>
> Thanks in advance!
> Stephan
>
> PS: Sorry for the missing subject!
>
------------------------------------------------------------
Anthony Ciani (aciani1_at_[hidden])
Computational Condensed Matter Physics
Department of Physics, University of Illinois, Chicago
http://ciani.phy.uic.edu/~tony
------------------------------------------------------------