Thanks, Artur and Jeff,
Good suggestions, both of them. I tried mpirun, and it worked. (Yes, you
might think, "well, why didn't he try that before?" I *did*, and it
failed as well, just differently. Then I rebuilt everything, and then
again, and by *then* I had forgotten to retry it. So it was a dumb
mistake, but I did try it back in the day.)
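For the archives, what works is essentially the mpirun analogue of the
mpiexec line quoted below (same binary and parameter file; set -np to
however many processes you want):

   lamboot -v
   mpirun -np 2 ./Gadget2 ../bobtodd_tests/thick_disk_test01/thick_disk01_paramfiles/thick_disk01.param
   lamhalt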
Okay, now that I've gotten it to work, I can feel I'm leaving no loose
ends as I move on to OpenMPI. But that means I have to get fftw2 working
by myself, since Fink configures it against LAM-MPI, not OpenMPI, and
that build gives me a whole litany of errors, even though I'm using the
exact same installation procedure I used when I installed it
(successfully) before. But those were the days of OS 10.4; I guess
that's the price I pay for innovation.
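In case anyone else ends up hand-building fftw2 for Gadget2 against
OpenMPI, the recipe I'm attempting is roughly the usual Gadget2 one,
shown below. The --prefix is just an example, and I'm assuming configure
picks up OpenMPI's mpicc from the PATH:

   # double-precision MPI libs with type prefix (dfftw, dfftw_mpi)
   ./configure --prefix=$HOME/fftw2 --enable-mpi --enable-type-prefix
   make && make install
   make clean
   # second pass for the single-precision libs (sfftw, sfftw_mpi)
   ./configure --prefix=$HOME/fftw2 --enable-mpi --enable-type-prefix --enable-float
   make && make install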
Oh, one other question: I created a .lamhosts file in my home directory,
which contains the single line
Macintosh.local cpu=2
Is this file (a) correctly written, (b) in the right place, given that
I'm running LAM from a totally different directory, and (c) irrelevant
if I'm trying to run on only one computer (even though I'm hoping to use
both CPUs)?
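(In case it's relevant: my understanding is that lamboot will also take
the boot schema as an explicit argument, which sidesteps the question of
where the file lives:

   lamboot -v ~/.lamhosts    # point lamboot at the schema explicitly
   lamnodes                  # should now report n0 with a CPU count of 2

but I'd still like to know what the expected default behavior is.)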
Anyhow, thanks again to both of you.
Best,
Todd
On Thu, 7 Feb 2008, Jeff Squyres wrote:
> It sounds like LAM's mpiexec script is misbehaving somehow; you might
> try mpirun instead (in LAM, mpiexec is a Perl script wrapper around
> mpirun).
>
> Or you might try switching to Open MPI. :-)
>
>
> On Feb 7, 2008, at 8:05 PM, Artur Tyliszczak wrote:
>
>> Hi Todd,
>>
>> Better switch to OpenMPI as soon as possible.
>> I spent a couple of days installing and then trying to run LAM 7.1.2 and
>> 7.1.4 on the same computer and OS you have.
>> I managed to install LAM, then I was able to compile my parallel codes
>> BUT there was no way to run them.
>> Strange errors were reported and finally I gave up.
>>
>> Jeff Squyres suggested going with OpenMPI. I did it: NO MORE PROBLEMS,
>> and everything works fine.
>>
>> Regards,
>> Artur.
>>
>> On 2008-02-08, at 00:49, Todd Krause wrote:
>>
>>>
>>> Hi,
>>>
>>> I'm new to this list and to LAM-MPI, so forgive me if I'm asking a
>>> common question. I tried to find the answer in the FAQs and archives,
>>> but either didn't find it or didn't recognize it when I saw it.
>>>
>>> I've installed LAM-MPI on my MacBook Pro, OS 10.5, using Fink. I'm
>>> trying to use it to run the downloadable cosmology code Gadget2. When
>>> I run the program using mpiexec, I get the following output:
>>>
>>> 05:00 PM btpro:Gadget2> lamboot -v
>>>
>>> LAM 7.1.3/MPI 2 C++/ROMIO - Indiana University
>>>
>>> n-1<85442> ssi:boot:base:linear: booting n0 (localhost)
>>> n-1<85442> ssi:boot:base:linear: finished
>>> 05:01 PM btpro:Gadget2> lamnodes
>>> n0      localhost:1:origin,this_node
>>> 05:01 PM btpro:Gadget2> lam-mpiexec -np 1 ./Gadget2 ../bobtodd_tests/thick_disk_test01/thick_disk01_paramfiles/thick_disk01.param
>>> --------------------------------------------------------------------------
>>> Failed to find the following executable:
>>>
>>> Host: Macintosh.local
>>> Executable: lam_appschema_ysCT98
>>>
>>> Cannot continue.
>>> --------------------------------------------------------------------------
>>> 05:01 PM btpro:Gadget2>
>>>
>>> I have not yet been able to figure out what this problem is. Any help
>>> would be greatly appreciated. (I recently figured out that, yes, I
>>> should be using OpenMPI, which is bundled with 10.5... I'll cross that
>>> bridge once I get this ironed out. :-)
>>>
>>> Thanks very much,
>>> Todd
>>>
>>> PS -- output of laminfo follows.
>>>
>>> 02:46 PM btpro:Gadget2> laminfo
>>> LAM/MPI: 7.1.3
>>> Prefix: /sw
>>> Architecture: i386-apple-darwin9.1.0
>>> Configured by: root
>>> Configured on: Thu Feb 7 14:48:44 CST 2008
>>> Configure host: Macintosh.local
>>> Memory manager: darwin7malloc
>>> C bindings: yes
>>> C++ bindings: yes
>>> Fortran bindings: yes
>>> C compiler: gcc
>>> C++ compiler: g++
>>> Fortran compiler: /sw/bin/gfortran
>>> Fortran symbols: underscore
>>> C profiling: yes
>>> C++ profiling: yes
>>> Fortran profiling: yes
>>> C++ exceptions: no
>>> Thread support: yes
>>> ROMIO support: yes
>>> IMPI support: no
>>> Debug support: no
>>> Purify clean: no
>>> SSI boot: globus (API v1.1, Module v0.6)
>>> SSI boot: rsh (API v1.1, Module v1.1)
>>> SSI boot: slurm (API v1.1, Module v1.0)
>>> SSI coll: lam_basic (API v1.1, Module v7.1)
>>> SSI coll: shmem (API v1.1, Module v1.0)
>>> SSI coll: smp (API v1.1, Module v1.2)
>>> SSI rpi: crtcp (API v1.1, Module v1.1)
>>> SSI rpi: lamd (API v1.0, Module v7.1)
>>> SSI rpi: sysv (API v1.0, Module v7.1)
>>> SSI rpi: tcp (API v1.0, Module v7.1)
>>> SSI rpi: usysv (API v1.0, Module v7.1)
>>> SSI cr: self (API v1.0, Module v1.0)
>>>
>>
>
>
> --
> Jeff Squyres
> Cisco Systems
>
> _______________________________________________
> This list is archived at http://www.lam-mpi.org/MailArchives/lam/
>