Brian Barrett wrote:
> On Jun 10, 2006, at 7:42 PM, Jeff Whitaker wrote:
>
>
>> I'm trying to run an operational weather prediction code from the
>> National Weather Service on a cluster of dual-G5 macs using xlf and
>> lam
>> 7.1.2. On the first call to mpi_allreduce, I get this error
>>
>> MPI_Allreduce: invalid operation: Invalid argument (rank 0,
>> MPI_COMM_WORLD)
>> Rank (0, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (0, MPI_COMM_WORLD): - MPI_Allreduce()
>> Rank (0, MPI_COMM_WORLD): - main()
>>
>> where the call looks like this
>>
>> call mpi_allreduce(super_val, super_val1, superp+1, mpi_rtype, &
>>                    mpi_sum, mpi_comm_world, ierror)
>>
>>
>> mpi_rtype is set to mpi_real8, and super_val and super_val1 are real*8
>> arrays of length superp+1. The same code works on a linux cluster
>> with
>> mpich. Is mpi_allreduce supported with mpi_real8 and mpi_sum?
>>
>
> LAM does not support using the MPI predefined operations with the
> optional Fortran datatypes (such as MPI_REAL8). It is unlikely we
> will be adding such support, as we are winding down support for LAM/
> MPI in favor of Open MPI. We'll gladly accept patches adding such
> functionality, of course. If switching to Open MPI is not an option
> and hacking at the internals of LAM isn't your thing, the final
> option is to provide a user-defined reduction operation that can deal
> with MPI_REAL8 datatypes and use that instead of MPI_SUM. Hardly
> perfect, but it will work in a pinch.
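[A user-defined reduction along the lines Brian suggests might look like the
following sketch; the subroutine name real8_sum and the op handle my_sum_op
are illustrative, not from the original thread:]

```fortran
! Element-wise real*8 sum, using the argument signature MPI requires
! for user-defined reduction functions: (invec, inoutvec, len, datatype).
! inoutvec receives the combined result.
subroutine real8_sum(invec, inoutvec, len, datatype)
  implicit none
  integer len, datatype, i
  real*8 invec(len), inoutvec(len)
  do i = 1, len
     inoutvec(i) = inoutvec(i) + invec(i)
  end do
end subroutine real8_sum

! In the caller: register the op (commutative = .true.), use it in
! place of mpi_sum, then free it when done.
external real8_sum
integer my_sum_op, ierror
call mpi_op_create(real8_sum, .true., my_sum_op, ierror)
call mpi_allreduce(super_val, super_val1, superp+1, mpi_rtype, &
                   my_sum_op, mpi_comm_world, ierror)
call mpi_op_free(my_sum_op, ierror)
```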
>
> By the way, just for clarification... The MPI standard is really
> fuzzy on whether MPI_SUM is required to work with MPI_REAL8. It's
> not explicitly listed in the types MPI_SUM should work with. On the
> other hand, it seems pretty dumb to have a predefined datatype that
> isn't supported by MPI_SUM. This happened in LAM because MPI_REAL8
> was added much later than the other datatypes, and whoever added it
> simply forgot to update the reduction code.
>
> Brian
>
>
>
Brian: Thanks for the explanation. It's unfortunate, since MPI_SUM
with real*8 is a fairly common construct, at least in the codes I deal
with. I've tried Open MPI, but am encountering mysterious crashes on OS
X. MPICH2 seems to work fine, though, so I'll go with that for now.
-Jeff
--
Jeffrey S. Whitaker Phone : (303)497-6313
Meteorologist FAX : (303)497-6449
NOAA/OAR/PSD R/PSD1 Email : Jeffrey.S.Whitaker_at_[hidden]
325 Broadway Office : Skaggs Research Cntr 1D-124
Boulder, CO, USA 80303-3328 Web : http://tinyurl.com/5telg