MPI is a standard; LAM is an implementation of that standard.
Here is the description of what MPI_Reduce is supposed to do:
http://www.mpi-forum.org/docs/mpi-11-html/node77.html#Node77
I suspect that the "op" argument of your call to MPI_Reduce is
incorrect somehow.
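
As a point of comparison, here is a minimal C sketch of a call that
uses a valid operation (MPI_SUM is one of the predefined MPI_Op
handles); if your code passes something that is not a valid MPI_Op
where MPI_SUM appears below, that would explain the "invalid
operation" error:

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, local, total;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* every rank contributes one value */
        local = rank + 1;

        /* combine the values with the predefined MPI_SUM operation;
           rank 0 receives the result in "total" */
        MPI_Reduce(&local, &total, 1, MPI_INT, MPI_SUM, 0,
                   MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum across all ranks = %d\n", total);

        MPI_Finalize();
        return 0;
    }

This should compile with mpicc and run under LAM on any number of
ranks.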
On Feb 1, 2008, at 12:39 AM, Abraham wrote:
> Hi,
> Many thanks, Jeff, for your reply, but I am not familiar with LAM;
> because of that, I couldn't solve this problem. I have looked
> through most of the LAM documentation, but I did not find what
> MPI_Reduce means.
> I still have trouble running the model on multiple CPUs, although I
> got it to run on a single CPU. I got this error message:
>
> MPI_Reduce: invalid operation: Invalid argument (rank 0,
> MPI_COMM_WORLD)
> Rank (0, MPI_COMM_WORLD): Call stack within LAM:
> Rank (0, MPI_COMM_WORLD): - MPI_Reduce()
> Rank (0, MPI_COMM_WORLD): - main()
>
>
>
> Can anyone tell me what the function of MPI_Reduce is and how I can
> fix this problem? I have mpicc, mpiCC, and mpif77 but not mpif90.
> Do I necessarily need mpif90, or can mpif77 perform the same
> function?
>
> Many thanks for your assistance in advance
> Abraham
>
>
> I replied to you 3 days ago; please see:
>
> http://www.lam-mpi.org/MailArchives/lam/2008/01/13562.php
>
>
> On Jan 22, 2008, at 8:45 PM, Abraham wrote:
>
>> Hi All,
>>
>>
>> I am having a run-time problem. I've got my application to run
>> successfully on one processor; however, when I tried to run it on
>> multiple processors, an error message was returned at the end of
>> the rsl.out.0000 file, which contains the output details for node
>> 0. This error can be seen below:
>>
>>> MPI_Reduce: invalid operation: Invalid argument (rank 0,
>>> MPI_COMM_WORLD)
>>> Rank (0, MPI_COMM_WORLD): Call stack within LAM:
>>> Rank (0, MPI_COMM_WORLD): - MPI_Reduce()
>>> Rank (0, MPI_COMM_WORLD): - main()
>>
>> laminfo:
>>
>> LAM/MPI: 7.1.1
>> Prefix: /usr/local/lam-7.1.1-pgi
>> Architecture: x86_64-unknown-linux-gnu
>> Configured by: root
>> Configured on: Fri Jan 14 16:39:09 EST 2005
>> Configure host: matrix
>> Memory manager: ptmalloc2
>> C bindings: yes
>> C++ bindings: yes
>> Fortran bindings: yes
>> C compiler: pgcc
>> C++ compiler: pgCC
>> Fortran compiler: pgf90
>> Fortran symbols: underscore
>> C profiling: yes
>> C++ profiling: yes
>> Fortran profiling: yes
>> C++ exceptions: no
>> Thread support: yes
>> ROMIO support: yes
>> IMPI support: no
>> Debug support: no
>> Purify clean: no
>> SSI boot: globus (API v1.1, Module v0.6)
>> SSI boot: rsh (API v1.1, Module v1.1)
>> SSI boot: slurm (API v1.1, Module v1.0)
>> SSI boot: tm (API v1.1, Module v1.1)
>> SSI coll: lam_basic (API v1.1, Module v7.1)
>> SSI coll: shmem (API v1.1, Module v1.0)
>> SSI coll: smp (API v1.1, Module v1.2)
>> SSI rpi: crtcp (API v1.1, Module v1.1)
>> SSI rpi: lamd (API v1.0, Module v7.1)
>> SSI rpi: sysv (API v1.0, Module v7.1)
>> SSI rpi: tcp (API v1.0, Module v7.1)
>> SSI rpi: usysv (API v1.0, Module v7.1)
>> SSI cr: self (API v1.0, Module v1.0)
>>
>> Can anyone figure the problem out? If so, please tell me, because
>> this problem has me completely stuck.
>> Your assistance will be highly appreciated.
>>
>> Thanks in advance
>> Abraham
>> Australia
>
>
>
> _______________________________________________
> This list is archived at http://www.lam-mpi.org/MailArchives/lam/
--
Jeff Squyres
Cisco Systems