Hello,
If you configured LAM with a c2c RPI (sysv or usysv), the spawned
processes will use c2c by default. You can pass --with-rpi=sysv or
--with-rpi=usysv as a configure option to make one of the c2c RPIs the
default RPI.
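
For example, a minimal sketch of such a build (the install prefix here
is only an illustration; adjust it for your site):

  ./configure --prefix=/usr/local/lam --with-rpi=usysv
  make
  make install

With usysv as the default RPI, processes started via MPI_Comm_spawn
will use c2c without any extra flags.
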
By default, the -O option is not set. If you cannot upgrade to LAM
7.0.2, and if you are going to run your jobs on *homogeneous* machines
only, we could point you to the places in the LAM source where a change
would make -O no longer required. Let us know if this is the case.
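
For context, here is a minimal sketch of a master that spawns workers
without mpirun, using the standard MPI-2 MPI_Comm_spawn call (the
"worker" executable name and the process count are placeholders):

  #include <mpi.h>

  int main(int argc, char *argv[])
  {
      MPI_Comm workers;  /* intercommunicator to the spawned processes */

      MPI_Init(&argc, &argv);

      /* Spawn 4 copies of "worker"; name and count are placeholders. */
      MPI_Comm_spawn("worker", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                     0, MPI_COMM_SELF, &workers, MPI_ERRCODES_IGNORE);

      /* ... exchange messages with the workers over the intercomm ... */

      MPI_Comm_free(&workers);
      MPI_Finalize();
      return 0;
  }

With a c2c-default build, these spawned processes communicate over the
c2c RPI even though mpirun is never invoked.
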
Hope this helps. Thanks.
--
Prashanth
pcharapa_at_[hidden]
LAM-MPI Team
Thus spoke Troy Klein in the message sent on Tue, 21 Oct 2003
->Unfortunately, I'm using LAM 6.5.9. Is there any way to enable the -c2c and
->-O functionality outside of mpirun under LAM 6.5.9? I'm using
->MPI_Comm_spawn to start the multicomputer, so mpirun is not useful.
->
->Thanks, Troy.
->
->-----Original Message-----
->From: Prashanth [mailto:pcharapa_at_[hidden]]
->Sent: Monday, October 20, 2003 4:44 PM
->To: General LAM/MPI mailing list
->Subject: Re: LAM: MPI_Comm_spawn and c2c
->
->
->
->Hello,
->
->Apologies for the late reply.
->
->What version of LAM are you using? Starting with LAM-7.0, users do not have
->to specify the '-O' flag to mpirun for optimization, as this is taken care of
->automatically by LAM. Also, you can select an RPI by setting the
->environment variable LAM_MPI_SSI_RPI to a valid RPI name (tcp, sysv, usysv,
->etc.)
->
->Please see the LAM 7.0.2 User's Guide for more information on SSI parameters.
->Hope this helps.
->
->--
->Prashanth
->pcharapa_at_[hidden]
->LAM-MPI Team
->
->
->Thus spoke Troy Klein in the message sent on Wed, 15 Oct 2003
->
->->I am using MPI_Comm_spawn in a MATLAB MEX file to spawn workers across
->->a Beowulf cluster. How can I optimize the communication in the way
->->that the -c2c and -O options of mpirun usually do? These options are
->->not available to me because I am not using mpirun.
->->
->->Thanks, Troy.
->->
->
->