Hi,
Thanks for pointing out the incorrect configure line. I still ran
into the same problems, though. I was finally able to compile static
programs after noticing that the conflict was between
/usr/lib64/libmpi and libc: by default, LAM was installing its
libraries into /usr/lib, so the copy in lib64/ was never being
updated with the new LAM configuration.
This configure line did the trick:
./configure --with-shared=no --with-memory-manager=none --libdir=/usr/lib64
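For anyone who hits this later, here is the full sequence that worked
for me, plus a quick check that the result really is static (hello.c
is the example from the LAM tarball; the paths are from my x86_64 box
and may differ on yours):

  ./configure --with-shared=no --with-memory-manager=none --libdir=/usr/lib64
  make
  make install
  mpicc -static hello.c -o hello   # -static is passed through to gcc
  ldd hello                        # should report "not a dynamic executable"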
> I'm not convinced that this is a reason for requiring a static MPI.
> What are you gaining by doing this?
>
We prefer static linking not only because it produces faster
executables, but also because we run our software on several machines
with similar or identical architectures, and we do not want to have
to deal with users needing various libraries at runtime (we use a
number of open-source libraries).
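To give a concrete illustration, this is the kind of library chase we
would otherwise walk users through on every new machine; the binary
and library names below are invented for the example:

  ldd ./our_solver
          libmpi.so.0 => not found
          libgmp.so.3 => not found
  ldd ./our_solver_static
          not a dynamic executable

With the static build there is nothing left to resolve at run time.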
Thank you for the help!
Umar Farooq,
On 9/21/06, Andrew Friedley <afriedle_at_[hidden]> wrote:
> Umar Farooq wrote:
> > Hi,
> >
> > We are trying to compile a program statically, and running into
> > linking problems with multiple definitions in libc and libmpi, exactly
> > the same as were encountered in this thread:
> > http://www.lam-mpi.org/MailArchives/lam/2005/08/11108.php
> >
> > However, even having configured LAM 7.1.2 with the following command:
> > "./configure --with-shared=no --with-memory-manager-none"
>
> Your configure line isn't quite correct. Try:
>
> ./configure --with-shared=no --with-memory-manager=none
>
> Note '=' instead of '-'. Also, I think --without-shared is preferred,
> but your syntax should work.
>
> > The "multiple definition" problems are still present for all the same
> > functions. This problem is present for even the "hello world" example
> > included in the LAM tarball when linked statically.
> > We do not want dynamic linking because we are using our software for
> > massively parallel programming in a research environment.
>
> I'm not convinced that this is a reason for requiring a static MPI.
> What are you gaining by doing this?
>
> Andrew
>
>
> > We are running CentOS, with gcc version 3.4.6.
> > The LAM configure and make work fine, and laminfo gives the following
> > information:
> > LAM/MPI: 7.1.2
> > Prefix: /usr
> > Architecture: x86_64-unknown-linux-gnu
> > Configured by: root
> > Configured on: Wed Sep 13 22:52:26 EDT 2006
> > Configure host: XXX.XXX.edu
> > Memory manager: none
> > C bindings: yes
> > C++ bindings: yes
> > Fortran bindings: yes
> > C compiler: gcc
> > C++ compiler: g++
> > Fortran compiler: g77
> > Fortran symbols: double_underscore
> > C profiling: yes
> > C++ profiling: yes
> > Fortran profiling: yes
> > C++ exceptions: no
> > Thread support: yes
> > ROMIO support: yes
> > IMPI support: no
> > Debug support: no
> > Purify clean: no
> > SSI boot: globus (API v1.1, Module v0.6)
> > SSI boot: rsh (API v1.1, Module v1.1)
> > SSI boot: slurm (API v1.1, Module v1.0)
> > SSI coll: lam_basic (API v1.1, Module v7.1)
> > SSI coll: shmem (API v1.1, Module v1.0)
> > SSI coll: smp (API v1.1, Module v1.2)
> > SSI rpi: crtcp (API v1.1, Module v1.1)
> > SSI rpi: lamd (API v1.0, Module v7.1)
> > SSI rpi: sysv (API v1.0, Module v7.1)
> > SSI rpi: tcp (API v1.0, Module v7.1)
> > SSI rpi: usysv (API v1.0, Module v7.1)
> > SSI cr: self (API v1.0, Module v1.0)
> >
> > Any help is welcome.
> >
> > Thanks,
>
> _______________________________________________
> This list is archived at http://www.lam-mpi.org/MailArchives/lam/
>