So I can't tell from this mail -- does this mean that BLACS is now
working for you, or that this is what you originally did and BLACS is
still not working?
On Mar 29, 2005, at 3:36 PM, Srinivasa Prade Patri wrote:
> Hi!
> I did look into the web page and set the following in Bmake.inc:
>
> SYSINC =
>
> INTFACE = -Df77IsF2C
> /* When I ran xintface, I got the following output:
> For this platform, set INTFACE = -Df77IsF2C */
>
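For reference, -Df77IsF2C tells BLACS that your Fortran compiler mangles
external names the way f2c does: an extra underscore is appended to any
name that already contains one.  Here is a minimal sketch of the kind of
macro this flag controls -- the names are illustrative, not the actual
BLACS source:

    /* Hypothetical illustration of the f2c-style name mangling that
     * -Df77IsF2C selects: f2c appends "__" to Fortran names that
     * contain an underscore, and a single "_" to names that do not. */
    #ifdef f77IsF2C
    #define F77_BLACS_PINFO blacs_pinfo__   /* trailing double underscore */
    #else
    #define F77_BLACS_PINFO blacs_pinfo_    /* single trailing underscore */
    #endif

    void F77_BLACS_PINFO(int *mypnum, int *nprocs);

Getting this wrong typically shows up as unresolved symbols at link time.
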
> TRANSCOMM = -DUseMpi2
> /* When I ran xtc_CsameF77, I got the following output:
> [patri_at_kemc ~]$ mpirun -v -np 2 /home/BLACS/INSTALL/EXE/xtc_CsameF77
> 23986 /home/BLACS/INSTALL/EXE/xtc_CsameF77 running on n0 (o)
> 23987 /home/BLACS/INSTALL/EXE/xtc_CsameF77 running on n0 (o)
> If this routine does not complete successfully,
> Do _NOT_ set TRANSCOMM = -DCSameF77
>
>
> Do _NOT_ set TRANSCOMM = -DCSameF77 */
>
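TRANSCOMM = -DUseMpi2 makes BLACS translate Fortran communicator handles
into C handles with the MPI-2 conversion routines, instead of assuming
the two representations are bit-identical (which is what -DCSameF77
asserts, and what the tester above warns you away from).  A minimal
sketch of the conversion, using only the standard MPI-2 API --
comm_from_fortran is my name for the wrapper, not a BLACS routine:

    #include <mpi.h>

    /* Convert a Fortran INTEGER communicator handle into a C MPI_Comm.
     * MPI_Comm_f2c is the portable MPI-2 way to do this; -DCSameF77
     * would instead assume a plain cast of the handle is safe. */
    MPI_Comm comm_from_fortran(MPI_Fint f_handle)
    {
        return MPI_Comm_f2c(f_handle);
    }
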
> F77 = mpif77
>
> CC = mpicc
>
> Thank you.
>
> Regards
> Srinivasa Patri
>
> -----Original Message-----
> From: Jeff Squyres <jsquyres_at_[hidden]>
> To: General LAM/MPI mailing list <lam_at_[hidden]>, sppatr2_at_[hidden]
> Date: Tue, 29 Mar 2005 14:23:21 -0500
> Subject: Re: LAM: Problem running the tester for BLACS
>
> Have you seen this page on how to build BLACS with LAM/MPI?
>
> http://www.lam-mpi.org/3rd-party/blacs.php3
>
> It may be helpful in making the BLACS code do the Right Things with
> regard to LAM/MPI.
>
>
> On Mar 29, 2005, at 2:00 PM, Srinivasa Prade Patri wrote:
>
>> Hi!
>> First of all, I would like to thank you all for your support. I got
>> my previous problem solved, and the test suite has compiled
>> successfully. But when running the executables I am getting the
>> errors listed below. What could be the problem? Once again, thank
>> you all for your support.
>>
>> -------------------------------------------------------------------------------------
>> [patri_at_e01 ~]$ mpirun -v C
>> /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0
>> 1435 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n0 (o)
>> 1196 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n1
>> 1194 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n2
>> 1194 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n3
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n4
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n5
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n6
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n7
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n8
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n9
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n10
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n11
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n12
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n13
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n14
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n15
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n16
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n17
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n18
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n19
>> 1191 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n20
>> 1192 /home/BLACS/TESTING/EXE/xFbtest_MPI-LINUX-0 running on n21
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=0, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 0,
>> MPI_COMM_WORLD)
>> Rank (0, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (0, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (0, MPI_COMM_WORLD): - main()
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=2, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=3, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=1, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 2,
>> MPI_COMM_WORLD)
>> Rank (2, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (2, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (2, MPI_COMM_WORLD): - main()
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 3,
>> MPI_COMM_WORLD)
>> Rank (3, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (3, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (3, MPI_COMM_WORLD): - main()
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 1,
>> MPI_COMM_WORLD)
>> Rank (1, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (1, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (1, MPI_COMM_WORLD): - main()
>> -----------------------------------------------------------------------------
>> One of the processes started by mpirun has exited with a nonzero exit
>> code. This typically indicates that the process finished in error.
>> If your process did not finish in error, be sure to include a "return
>> 0" or "exit(0)" in your C code before exiting the application.
>>
>> PID 1435 failed on node n0 (10.5.0.1) with exit status 1.
>> -----------------------------------------------------------------------------
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=18, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=5, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 18,
>> MPI_COMM_WORLD)
>> Rank (18, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (18, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (18, MPI_COMM_WORLD): - main()
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 5,
>> MPI_COMM_WORLD)
>> Rank (5, MPI_COMM_WORLD): Call stack within LAM:
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=14, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> Rank (5, MPI_COMM_WORLD): - MPI_Comm_group()
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 14,
>> MPI_COMM_WORLD)
>> Rank (14, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (5, MPI_COMM_WORLD): - main()
>> Rank (14, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (14, MPI_COMM_WORLD): - main()
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=20, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 20,
>> MPI_COMM_WORLD)
>> Rank (20, MPI_COMM_WORLD): Call stack within LAM:
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=21, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> Rank (20, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (20, MPI_COMM_WORLD): - main()
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 21,
>> MPI_COMM_WORLD)
>> Rank (21, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (21, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (21, MPI_COMM_WORLD): - main()
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=16, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 16,
>> MPI_COMM_WORLD)
>> Rank (16, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (16, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (16, MPI_COMM_WORLD): - main()
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=17, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 17,
>> MPI_COMM_WORLD)
>> Rank (17, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (17, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (17, MPI_COMM_WORLD): - main()
>> BLACS WARNING 'No need to set message ID range due to MPI
>> communicator.'
>> from {-1,-1}, pnum=19, Contxt=-1, on line 18 of file 'blacs_set_.c'.
>>
>> MPI_Comm_group: invalid communicator: Invalid argument (rank 19,
>> MPI_COMM_WORLD)
>> Rank (19, MPI_COMM_WORLD): Call stack within LAM:
>> Rank (19, MPI_COMM_WORLD): - MPI_Comm_group()
>> Rank (19, MPI_COMM_WORLD): - main()
>> -------------------------------------------------------------------------------------
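For what it's worth, the failure point throughout this log is a plain
MPI_Comm_group() call.  A standalone check like the one below (my
sketch, not BLACS source) should run cleanly under LAM if the MPI
installation itself is healthy; if it does, the "invalid communicator"
errors above point at the TRANSCOMM handle translation rather than at
LAM itself:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* BLACS does the equivalent of this on the communicator it is
         * handed; a mistranslated handle fails here with "invalid
         * communicator", exactly as in the log above. */
        MPI_Group grp;
        MPI_Comm_group(MPI_COMM_WORLD, &grp);

        MPI_Group_free(&grp);
        MPI_Finalize();
        return 0;
    }

Compile it with mpicc (e.g. "mpicc check_group.c -o check_group",
where check_group.c is whatever you name the file) and run it under
mpirun as you did above.
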
>>
>> Regards
>> Srinivasa Patri
>>
>>
>
> --
> {+} Jeff Squyres
> {+} jsquyres_at_[hidden]
> {+} http://www.lam-mpi.org/
>
> _______________________________________________
> This list is archived at http://www.lam-mpi.org/MailArchives/lam/
>
--
{+} Jeff Squyres
{+} jsquyres_at_[hidden]
{+} http://www.lam-mpi.org/