
LAM/MPI General User's Mailing List Archives


From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2008-02-07 20:10:02


It sounds like LAM's mpiexec script is misbehaving somehow -- you
might try mpirun instead (in LAM, mpiexec is a Perl script wrapper
around mpirun).
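For instance, a minimal sketch of what that would look like, based on
the command line from your mail (the param-file path here is a
placeholder -- substitute your actual one):

```shell
# Boot the LAM runtime on the local node, then launch with mpirun
# directly rather than going through the mpiexec Perl wrapper.
lamboot -v

# -np 1 starts a single process, matching your earlier test.
mpirun -np 1 ./Gadget2 path/to/thick_disk01.param
```

If mpirun succeeds where mpiexec fails, that would confirm the problem
is in the wrapper script rather than in the LAM runtime itself.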

Or you might try switching to Open MPI. :-)

On Feb 7, 2008, at 8:05 PM, Artur Tyliszczak wrote:

> Hi Todd,
>
> Better to switch to Open MPI as soon as possible.
> I spent a couple of days installing and then trying to run LAM 7.1.2
> and 7.1.4 on the same computer and OS you have.
> I managed to install LAM, and then I was able to compile my parallel
> codes, BUT there was no way to run them.
> Strange errors were reported, and finally I gave up.
>
> Jeff Squyres suggested going with Open MPI. I did. NO MORE PROBLEMS,
> and everything works fine.
>
> Regards,
> Artur.
>
> On 2008-02-08, at 00:49, Todd Krause wrote:
>
>>
>> Hi,
>>
>> I'm new to this list and to lam-mpi, so forgive me if I'm asking a
>> common
>> question. I tried to find the answer on the FAQs and archives, but
>> either
>> didn't find it, or didn't know it when I saw it.
>>
>> I've installed lam-mpi on my MacBook Pro, OS 10.5, using Fink. I'm
>> trying
>> to use lam-mpi to run the downloadable cosmology code Gadget2. When
>> I run
>> the program using mpiexec, I get the following output:
>>
>> 05:00 PM btpro:Gadget2> lamboot -v
>>
>> LAM 7.1.3/MPI 2 C++/ROMIO - Indiana University
>>
>> n-1<85442> ssi:boot:base:linear: booting n0 (localhost)
>> n-1<85442> ssi:boot:base:linear: finished
>> 05:01 PM btpro:Gadget2> lamnodes
>> n0 localhost:1:origin,this_node
>> 05:01 PM btpro:Gadget2> lam-mpiexec -np 1 ./Gadget2
>> ../bobtodd_tests/thick_disk_test01/thick_disk01_paramfiles/
>> thick_disk01.param
>> --------------------------------------------------------------------------
>> Failed to find the following executable:
>>
>> Host: Macintosh.local
>> Executable: lam_appschema_ysCT98
>>
>> Cannot continue.
>> --------------------------------------------------------------------------
>> 05:01 PM btpro:Gadget2>
>>
>> I have not yet been able to figure out what this problem is. Any
>> help
>> would be greatly appreciated. (I recently figured out that, yes, I
>> should be using Open MPI, which is bundled with 10.5... I'll cross
>> that bridge once I get this ironed out. :-)
>>
>> Thanks very much,
>> Todd
>>
>> PS -- output of laminfo follows.
>>
>> 02:46 PM btpro:Gadget2> laminfo
>> LAM/MPI: 7.1.3
>> Prefix: /sw
>> Architecture: i386-apple-darwin9.1.0
>> Configured by: root
>> Configured on: Thu Feb 7 14:48:44 CST 2008
>> Configure host: Macintosh.local
>> Memory manager: darwin7malloc
>> C bindings: yes
>> C++ bindings: yes
>> Fortran bindings: yes
>> C compiler: gcc
>> C++ compiler: g++
>> Fortran compiler: /sw/bin/gfortran
>> Fortran symbols: underscore
>> C profiling: yes
>> C++ profiling: yes
>> Fortran profiling: yes
>> C++ exceptions: no
>> Thread support: yes
>> ROMIO support: yes
>> IMPI support: no
>> Debug support: no
>> Purify clean: no
>> SSI boot: globus (API v1.1, Module v0.6)
>> SSI boot: rsh (API v1.1, Module v1.1)
>> SSI boot: slurm (API v1.1, Module v1.0)
>> SSI coll: lam_basic (API v1.1, Module v7.1)
>> SSI coll: shmem (API v1.1, Module v1.0)
>> SSI coll: smp (API v1.1, Module v1.2)
>> SSI rpi: crtcp (API v1.1, Module v1.1)
>> SSI rpi: lamd (API v1.0, Module v7.1)
>> SSI rpi: sysv (API v1.0, Module v7.1)
>> SSI rpi: tcp (API v1.0, Module v7.1)
>> SSI rpi: usysv (API v1.0, Module v7.1)
>> SSI cr: self (API v1.0, Module v1.0)
>>
>> _______________________________________________
>> This list is archived at http://www.lam-mpi.org/MailArchives/lam/
>

-- 
Jeff Squyres
Cisco Systems