
LAM/MPI General User's Mailing List Archives


From: Jeff Squyres (jsquyres_at_[hidden])
Date: 2004-11-18 07:49:46


On Nov 18, 2004, at 4:45 AM, Davide Cesari wrote:

> May I add a comment: precompiled binaries are currently unusable for
> those of us (many, I think) who use Fortran compilers, which
> unfortunately do not share a common calling convention. If some effort
> could be put into providing different Fortran interfaces covering most
> of the existing conventions, selectable in some way at runtime (I mean
> at the time the 'mpif77' command is run) along with the compiler name,
> then I think more people could at least start with the precompiled
> binaries, and the building problem would be postponed to a later
> optimisation phase.

Many thanks for your comments!

One note on this, since a few people have brought it up...

In Open MPI, we stole a trick from LA-MPI's book (Los Alamos MPI). On
systems that support it (such as Linux), we compile weak symbols for
all 4 Fortran 77 compiler conventions. That is, we compile the Fortran
wrappers once, but they export 4 weak symbols -- one for each of the
"normal" F77 compiler conventions. Hence, one installation of Open MPI
should be suitable for all F77 compilers.

Some systems do not support this, however. OS X immediately leaps to
mind. In those situations, Open MPI behaves like LAM/MPI -- it
compiles one set of Fortran bindings for the F77 compiler that was
found during configure.

So this makes Fortran 77 and C work nicely on many platforms, and
hopefully reduces the number of MPI installations you need for your
site.

The same does not hold true for F90 and C++, unfortunately. With F77,
there are only 4 common ways that compilers mangle symbol names (all
lower case with zero, one, or two trailing underscores, or all upper
case). With F90 and C++, there is no such uniformity. We're toying
around with a few ways to still have "one" installation (e.g., have
something like libmpi_g++.so, libmpi_icpc.so, etc., and have the
wrapper intelligently choose between them), but haven't had time to
come up with a final solution there yet.
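To make the "wrapper chooses the library" idea concrete, here is a hypothetical shell sketch of how a wrapper compiler might pick a compiler-specific C++ library at invocation time. The library names libmpi_g++.so and libmpi_icpc.so come from the message above; the function name pick_mpi_cxx_lib and the matching logic are made up for illustration:

```shell
#!/bin/sh
# Hypothetical sketch: map the back-end C++ compiler the user asked
# for onto the MPI library built with the matching ABI.
pick_mpi_cxx_lib() {
    case "$(basename "$1")" in
        g++*)  echo "libmpi_g++.so"  ;;  # GNU C++ ABI
        icpc*) echo "libmpi_icpc.so" ;;  # Intel C++ ABI
        *)     echo "libmpi.so"      ;;  # fallback: default build
    esac
}

# e.g. a wrapper invoked with the Intel compiler would link:
pick_mpi_cxx_lib /opt/intel/bin/icpc
```

The wrapper would then add the chosen library to its link line, so one installed tree could serve several C++ compilers.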

-- 
{+} Jeff Squyres
{+} jsquyres_at_[hidden]
{+} http://www.lam-mpi.org/