
LAM/MPI General User's Mailing List Archives


From: Elie Choueiri (elie.choueiri_at_[hidden])
Date: 2007-03-30 05:27:06


Organize your code differently.

Instead of calling id = MPI::COMM_WORLD.Get_rank(); in Objective1(), make
that call in main().

Then you can say:

if (id == 0) {
    // master work
}
else {
    // slave work
}

and if the master and slaves both call the same function anyway, you can
drop the ifs and just call the function.
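
For what it's worth, here is a rough, untested sketch of that layout using
the MPI C++ bindings. The tag value, the placeholder fitness work, and the
way the scores are combined are made up for illustration, and I've left out
the GAlib types so it compiles on its own:

#include <mpi.h>
#include <iostream>

// Every rank calls this; the master/slave split happens inside it.
float Objective1(int id, int p)
{
    int tag = 15;              // arbitrary tag, same on both sides
    float score = 0.0f;

    if (id == 0) {             // master: collect one result per slave
        for (int slave = 1; slave < p; slave++) {
            float s;
            MPI::COMM_WORLD.Recv(&s, 1, MPI::FLOAT, slave, tag);
            score += s;
        }
    }
    else {                     // slave: compute something, send it back
        float s = 1.0f;        // placeholder for the real fitness work
        MPI::COMM_WORLD.Send(&s, 1, MPI::FLOAT, 0, tag);
    }
    return score;
}

int main(int argc, char *argv[])
{
    MPI::Init(argc, argv);

    int id = MPI::COMM_WORLD.Get_rank();   // rank obtained once, in main()
    int p  = MPI::COMM_WORLD.Get_size();

    // Master and slaves all fall through to the same call; the branching
    // on the rank happens inside Objective1().
    float score = Objective1(id, p);
    if (id == 0)
        std::cout << "score: " << score << std::endl;

    MPI::Finalize();           // every rank must reach Finalize
    return 0;
}

The important part is only that every rank executes main(), every rank
reaches Objective1(), and every rank calls MPI::Finalize(); how the work is
split inside the if is up to you.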

On 3/30/07, Muhammad Faiz Misman <faizmisman_at_[hidden]> wrote:
>
> Yes, you're right: the slaves never enter the functions ga() and
> Objective1(). Thanks for your help. Can anyone give me some advice or an
> idea on how to get the slaves into the function Objective1()? In my
> program, main() is only for the master.
>
> On 3/28/07, Elie Choueiri <elie.choueiri_at_[hidden]> wrote:
> >
> > Hello.
> > I'm not very familiar with the C++ calls, but here are a few points to
> > watch out for from a quick scan of the code:
> >
> > 1) Do the slaves ever call MPI::Finalize()?
> > 2) Do the slaves ever enter ga() or Objective1()? main() only has the
> > master doing anything.
> >
> >
> > On 3/28/07, Muhammad Faiz Misman <faizmisman_at_[hidden]> wrote:
> >
> > > Hi. I'm doing a parallel GA for my final project using C++, but I'm
> > > having a problem with my program. When I run it, this error comes up:
> > >
> > > One of the processes started by mpirun has exited with a nonzero exit
> > > code. This typically indicates that the process finished in error.
> > >
> > > If your process did not finish in error, be sure to include a "return 0"
> > > or "exit(0)" in your C code before exiting the application.
> > >
> > > PID 26349 failed on node n0 (10.1.1.1) due to signal 13.
> > >
> > > I hope someone can help me figure out what's wrong with my program and
> > > what signal 13 means. Thanks.
> > >
> > > Here is the algorithm of my program (most of the parallel code is in
> > > the function Objective1):
> > >
> > > ............................................................
> > >
> > >
> > > int main(int argc, char *argv[]) { //all in main is the master part
> > >
> > > MPI::Init(argc, argv);
> > >
> > > master = id = 0;
> > >
> > > if (id == master) {
> > >
> > > ga(); //calling the function GA
> > >
> > > MPI::Finalize();
> > >
> > > } //end master
> > >
> > > } // end main
> > > .
> > > .
> > > .
> > > void ga() {
> > > .
> > > .
> > > .
> > > float Objective1();
> > >
> > > } //function ga
> > >
> > > void svm_train(); //function svm_train
> > >
> > > float Objective1(GAGenome& g)
> > > {
> > >
> > > int p, id, slave;
> > > int master = 0;
> > > int tag = 15;
> > > MPI::Status status;
> > >
> > > id = MPI::COMM_WORLD.Get_rank();
> > > p = MPI::COMM_WORLD.Get_size();
> > >
> > > MPI::Request send[p], recv[p];
> > >
> > > GA1DBinaryStringGenome & genome = (GA1DBinaryStringGenome &)g;
> > > fstream file_index(indexf,ios::trunc|ios::in|ios::out); //create file for index output
> > > double score=0.0;
> > >
> > > if (id == master) {
> > >
> > > if(!file_index)
> > > {
> > > cerr << "Cannot open index1 file " << file_index << " for input.\n";
> > > exit(1);
> > > }
> > >
> > > cout<<"chromosome value "<<++bilang<<": ";
> > > for (int i=0; i<genome.length(); i++)
> > > {
> > > cout<<(genome.gene(i));
> > > file_index<<( genome.gene(i));
> > > }
> > > cout<<" ";
> > > file_index<<endl;
> > > file_index.close();
> > >
> > > for (slave = 1; slave < p; slave++) {
> > > //cout<<"sending to slave "<<slave<<" from "<<id<<endl;
> > > send[slave] = MPI::COMM_WORLD.Isend(&indexf, 1, MPI::CHAR, bilang, tag);
> > > send[slave] = MPI::COMM_WORLD.Isend(&file_index, 1, MPI::CHAR, bilang, tag);
> > >
> > > }//end for
> > >
> > > }//end master
> > > else {
> > >
> > > recv[slave] = MPI::COMM_WORLD.Irecv(&indexf, 1, MPI::CHAR, master, tag);
> > > recv[slave] = MPI::COMM_WORLD.Irecv(&file_index, 1, MPI::CHAR, master, tag);
> > >
> > > featureselection_train(); //select features (genes) from s2AMLALL_train.fs.data
> > > featureselection_test(); //select features (genes) from s2AMLALL_test.fs.data
> > > svm_train();
> > >
> > > send[slave] = MPI::COMM_WORLD.Isend(&cvscore, 1, MPI::DOUBLE, master, tag);
> > >
> > > }//end slaves
> > >
> > > if (id == master) {
> > >
> > > for (slave = 1; slave < bilang; slave++) {
> > > recv[slave] = MPI::COMM_WORLD.Irecv(&cvscore, 1, MPI::CHAR, slave, tag);
> > >
> > > }//end for
> > >
> > > //MPI::Request::Waitall(p, recv);
> > >
> > > //svm__test();
> > > //score=testscore;
> > > cout<<" cvs in fitness: "<<cvscore<<" "<<endl;
> > >
> > > score=cvscore;
> > >
> > > }//end master
> > >
> > > return score;
> > > }
> > >

-- 
(N)E