Hello,
I'm having a problem which I hope someone can help me solve.
I'm currently converting an evolutionary algorithm from a sequential
version to a parallel version. I'm using a master/slave model and am
building a scaled-down model which is conceptually similar to what the
main algorithm will do in terms of operations, in an attempt to ensure
that the message passing is correct and operational integrity is
maintained.
The operations in this model are similar to what the main algorithm
will do, only scaled down to arbitrary problems. I am also using
hard-coded values to check that I am actually passing my data
correctly, so I can see exactly what is happening and that what is
expected is returned.
The algorithm works by initialising a 2D population array within a
given boundary and then sending this population of vectors to all of
the slaves.
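To make that concrete, here is roughly what the master does first.
This is a simplified sketch rather than my actual code; POP_SIZE,
DIM, the bounds and the message tags are made-up values:

#include <mpi.h>
#include <stdlib.h>

#define POP_SIZE 10     /* number of vectors in the population */
#define DIM       4     /* elements per vector                 */
#define LOWER  -5.0     /* lower bound of the search boundary  */
#define UPPER   5.0     /* upper bound of the search boundary  */

double population[POP_SIZE][DIM];

/* Initialise the population within the boundary and send the whole
   contiguous 2D array to every slave (ranks 1..nslaves, tag 0). */
void init_and_send(int nslaves)
{
    for (int i = 0; i < POP_SIZE; i++)
        for (int j = 0; j < DIM; j++)
            population[i][j] =
                LOWER + (UPPER - LOWER) * rand() / (double)RAND_MAX;

    for (int rank = 1; rank <= nslaves; rank++)
        MPI_Send(&population[0][0], POP_SIZE * DIM, MPI_DOUBLE,
                 rank, 0, MPI_COMM_WORLD);
}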
The slaves will perform the algorithm (mutation, crossover, selection)
or, in the case of this scaled-down version, just perform some
arbitrary operation to ensure that the data they send back is
different. I am just multiplying the first data element in the 2D
population array by the slave id to ensure the master receives
different results from the slaves.
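On the slave side that looks something like this (again just a
sketch, reusing the constants from the sketch above):

/* Receive the population, perturb the first element by the slave's
   rank (a stand-in for mutation/crossover/selection) and send the
   resulting "best cost" back to the master. */
void slave_step(int myrank)
{
    double pop[POP_SIZE][DIM];
    MPI_Status status;

    MPI_Recv(&pop[0][0], POP_SIZE * DIM, MPI_DOUBLE,
             0, 0, MPI_COMM_WORLD, &status);

    pop[0][0] *= myrank;            /* arbitrary operation */

    double best_cost = pop[0][0];   /* first element stands in for the cost */
    MPI_Send(&best_cost, 1, MPI_DOUBLE, 0, 1, MPI_COMM_WORLD);
}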
The algorithm uses a target function to get cost values from the
vectors in the population after it creates new trial members by
mutation and crossover, as in any standard genetic algorithm. This
cost value is then of course used to find the fittest solutions.
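In sketch form the cost evaluation is just a function of one vector;
here a simple sphere function stands in for my actual target:

/* Cost of one vector: a placeholder target function,
   f(x) = sum of x_j squared, minimised at the origin. */
double cost(const double *vec, int dim)
{
    double sum = 0.0;
    for (int j = 0; j < dim; j++)
        sum += vec[j] * vec[j];
    return sum;
}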
I propose a simple model whereby the slaves all return their best
cost, which in this scaled-down model is simply represented by the
first element of the population array. The master compares all the
returned results and requests the population from which the best cost
was sent. It does this by sending a "dummy" population-request
variable. This new population is then mapped onto the population array
that is sent out at the beginning of the master loop.
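One master iteration would then look roughly like this (a sketch; the
tags, the dummy variable and the helper name are my own invention):

/* Collect every slave's best cost, pick the winner, request that
   slave's population with a dummy variable, and map the reply back
   onto the population array sent out at the top of the loop. */
double master_step(int nslaves)
{
    double best = 1e300;
    int winner = 1;
    MPI_Status status;

    for (int i = 0; i < nslaves; i++) {
        double c;
        MPI_Recv(&c, 1, MPI_DOUBLE, MPI_ANY_SOURCE, 1,
                 MPI_COMM_WORLD, &status);
        if (c < best) {
            best = c;
            winner = status.MPI_SOURCE;
        }
    }

    int dummy = 1;   /* "please send me your population" request */
    MPI_Send(&dummy, 1, MPI_INT, winner, 2, MPI_COMM_WORLD);
    MPI_Recv(&population[0][0], POP_SIZE * DIM, MPI_DOUBLE,
             winner, 3, MPI_COMM_WORLD, &status);
    return best;
}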
I have commented out the master while loop here so that you can see
what happens before and after use. This model was based somewhat on
the skeleton provided on the LAM website; however, that model uses
the idea that "work" is a finite set, so the master's termination
condition is while (work != NULL). As the goal of this algorithm is
to minimise a function, it is not known in advance how many iterations
it will need, so the termination condition would be while ("function
not minimised").
However, as soon as I start implementing my while condition, I get
runtime errors.
I think I am trying to re-assign the new population sent to the
slaves onto an array which is already initialised after the first send
call. So I am thinking that correct pointers would be needed, so that
the slave receives a population into a pointer which can be reassigned
each time. My attempts to try this have ended in even stranger
results, so perhaps someone could offer advice on a viable solution?
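For what it's worth, this is the shape of what I have been attempting
on the slave side (the idea, not my failing code): a single fixed
buffer that MPI_Recv overwrites on every pass, rather than
reassigning the array itself.

/* One fixed buffer that MPI_Recv simply overwrites on each pass;
   no re-pointing of the array should be needed. */
void slave_loop(int myrank)
{
    double pop[POP_SIZE][DIM];   /* reused every iteration */
    int keep_going = 1;
    MPI_Status status;

    while (keep_going) {
        MPI_Recv(&pop[0][0], POP_SIZE * DIM, MPI_DOUBLE,
                 0, 0, MPI_COMM_WORLD, &status);

        pop[0][0] *= myrank;                 /* arbitrary work, as before */
        double best_cost = pop[0][0];
        MPI_Send(&best_cost, 1, MPI_DOUBLE, 0, 1, MPI_COMM_WORLD);

        /* (the winning slave would also have to answer the master's
           dummy population request here; omitted for brevity) */

        MPI_Recv(&keep_going, 1, MPI_INT, 0, 9, MPI_COMM_WORLD, &status);
    }
}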
Theoretically, is this master/slave model viable or usable? Am I
going about the problem the wrong way? If this model is not viable,
perhaps someone could suggest alternatives; or can I just get around
it with pointers, with which I am getting strange results at the
moment?
Many thanks in advance for any help with this implementation.