Project Information
Authors
 Antonio J. Nebro (University of Málaga)
 Francisco Luna (University of Málaga)
 Enrique Alba (University of Málaga)
If you have any questions, please send a message to Antonio J. Nebro (antonio@lcc.uma.es).
Version
1.0 (last modification: 1 March 2006)
Introduction
The DEME project is devoted to the study of multiobjective optimization problem solving using sequential, parallel, and distributed metaheuristic algorithms. Metaheuristics are high-level techniques that drive a number of underlying heuristic methods in the search for an optimum. In the field of multiobjective optimization, this optimum is the set of solutions known as the Pareto Optimal Set (or Pareto Front, if we consider the objective space).
Our main goal is the design and development of parallel and distributed multiobjective metaheuristic algorithms. There are at least two possible ways to pursue this objective: first, to parallelize an existing sequential algorithm and, second, to design a new parallel metaheuristic. While the second approach is very attractive, we consider that the logical path is to begin with the study of the parallelization of well-known sequential algorithms. Furthermore, the codes of some of these algorithms are freely available; in particular, there are implementations of PAES, NSGA-II, and SPEA2 in C/C++.
However, the obvious fact that each algorithm has been implemented by a different team introduces some difficulties. On the one hand, comparing issues such as running times in a fair way is not an easy task; for example, the available implementation of NSGA-II performs a lot of file writes, while SPEA2 uses a file-sharing mechanism to connect a solver process and a variator process. On the other hand, our idea of modifying the algorithms to develop distributed versions of them would force us to write a specific version for each of them, whereas we would like to reuse the distribution code in all the algorithms. These drawbacks led us to consider, as a new first objective, the development from scratch of implementations of some sequential metaheuristics, so as to avoid the aforementioned inconveniences. An additional advantage of implementing the algorithms ourselves is that we gain a better understanding of their working principles.
Although our main objective is applying parallelism to multiobjective optimization, it is difficult to resist the temptation of designing new sequential metaheuristics for multiobjective optimization, since this is still an open research area. As a result, we have designed and implemented two new algorithms for solving MOPs: SSMO (Scatter Search Multiobjective Optimization) and AbYSS (Archive-based hYbrid Scatter Search).
This project began in Autumn 2003, and it is related to the ESaM project (Enumerative Search applied to Multiobjective optimization).
The current goals of the project are described in Goals, the current status of the project is detailed in Current Status, and the remaining tasks are listed in the Todo list.
Goals
The current goals of the project are summarized in the following points:
Design and implementation of a library of classes for building metaheuristics. The library is aimed at enabling the implementation of population-based metaheuristics, as is the case with many evolutionary algorithms. This library, named libmoea, must include the building blocks (genes, chromosomes, individuals, populations) that make up a population-based algorithm, as well as a rich set of operators for mutation, selection, recombination, niching, and archiving. 
Build a library of multiobjective problems, named libproblem, that can be used by any metaheuristic. Problems must be easily added to this library, without requiring any recompilation or modification of the libmoea library or the algorithms. 
Development of sequential metaheuristic algorithms using libmoea. These algorithms will be evaluated using the problems of the libproblem library. The codes implementing these algorithms must be easy to understand and modify; for these reasons, the programs should follow as closely as possible their description in the papers where they were presented. 
Design, implementation, and evaluation of parallel and distributed metaheuristics.
The programming language chosen to develop this project is C++, although Java is being considered for future developments.
Current Status
The following points describe the current status of the project:
The library libmoea contains a class named Gene, which is the base class of several gene representations (BinaryGene, RealGene, IntegerGene, and BinaryRealGene). As usual, a Chromosome is composed of genes, an Individual contains a chromosome and an array of fitness values, and a Population is a collection of individuals. 
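These building blocks can be pictured with a minimal C++ sketch. The class names (Gene, RealGene, Chromosome, Individual, Population) come from the description above; all members and methods shown here are illustrative assumptions, not the actual libmoea API.

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Base class for all gene representations (hypothetical sketch).
struct Gene {
    virtual ~Gene() = default;
};

// A possible real-valued gene representation.
struct RealGene : Gene {
    double value = 0.0;
    explicit RealGene(double v) : value(v) {}
};

// A Chromosome is composed of genes.
struct Chromosome {
    std::vector<std::unique_ptr<Gene>> genes;
};

// An Individual contains a chromosome and one fitness value per objective.
struct Individual {
    Chromosome chromosome;
    std::vector<double> fitness;
};

// A Population is a collection of individuals.
struct Population {
    std::vector<Individual> individuals;
    std::size_t size() const { return individuals.size(); }
};
```

The point of the layered structure is that algorithms only manipulate populations and individuals, while representation details stay behind the Gene interface.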
The library libproblem currently contains around 35 multiobjective optimization problems. They are listed in ProblemHeaders.h. Many of them are taken from the book of Coello et al. [2002], others are included in the book of K. Deb [2002], and others are taken from the specialized literature. Furthermore, a number of single-objective optimization problems are included as well for testing purposes. 
All the problems inherit from the class MultiobjectiveProblem, which contains, among others, two methods: evaluate() and evaluateConstraints(). This class is designed to provide a flexible way of describing a MOP. Thus, for example, those problems requiring real variables can be created taking as a parameter the desired representation of the genes (currently real, binary-coded real, or binary Gray-coded real). 
libmoea contains a class named MoEA, which is the base class of all the developed MOEAs. This class allows specifying the mutation and crossover operators that will be used by the MOEA algorithm. 
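One way such pluggable operators could look is sketched below. The use of std::function and the helper makeExample() are assumptions made for this illustration; the actual MoEA class may wire operators differently.

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <vector>

using Solution = std::vector<double>;

// Hypothetical sketch: an algorithm base class holding configurable
// mutation and crossover operators, in the spirit of the MoEA class.
class MoEA {
public:
    std::function<Solution(const Solution&)> mutation;
    std::function<Solution(const Solution&, const Solution&)> crossover;
};

// Configure a toy algorithm instance with simple operators.
inline MoEA makeExample() {
    MoEA algo;
    algo.mutation = [](const Solution& s) {
        Solution out = s;
        for (double& v : out) v += 0.1;  // toy perturbation, not a real operator
        return out;
    };
    algo.crossover = [](const Solution& a, const Solution& b) {
        Solution child(a.size());
        for (std::size_t i = 0; i < a.size(); ++i) child[i] = 0.5 * (a[i] + b[i]);
        return child;
    };
    return algo;
}
```

Separating the operators from the algorithm skeleton is what lets the same MOEA loop be reused with the bit-flip, polynomial, SBX, and BLX-alpha operators listed later.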
The following list details the metaheuristics developed so far:

Nsga2. The NSGA-II algorithm described in [Deb et al, 2002]. 
Ssmo1. The SSMOv1 version of the Scatter Search algorithm for Multiobjective Optimization (SSMO), which uses the ranking and crowding mechanisms of NSGA-II in the reference set update method. 
Ssmo2. The SSMOv2 version of SSMO, which uses the centroids approach in the reference set update method. 
AbYSS. The Archive-based hYbrid Scatter Search algorithm.

 All the algorithms use a simple constraint-handling mechanism, which can either count the number of violated constraints (as, for example, the micro-GA of Coello and Pulido does) or measure the overall constraint violation used in NSGA-II.
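The two constraint-handling options can be sketched as follows, under the common convention that a constraint g(x) >= 0 is satisfied and negative values measure the violation. The function names are illustrative, not the DEME API.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Option 1: count the number of violated constraints (micro-GA style).
inline int violatedConstraints(const std::vector<double>& g) {
    return static_cast<int>(std::count_if(g.begin(), g.end(),
                                          [](double v) { return v < 0.0; }));
}

// Option 2: NSGA-II-style overall constraint violation: sum of the
// magnitudes of all violated constraints.
inline double overallViolation(const std::vector<double>& g) {
    double total = 0.0;
    for (double v : g)
        if (v < 0.0) total += -v;
    return total;
}
```

The first option only says how many constraints fail, while the second also grades how badly they fail, which gives selection a gradient toward feasibility.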
 Currently, there is a bit-flip mutation operator for binary individuals, and several mutation operators for real-coded problems, including random, polynomial, and uniform mutation.
 The crossover operators implemented so far include single-point crossover for binary individuals, and simulated binary crossover and BLX-alpha for real individuals.
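As a concrete example, single-point crossover for binary-coded individuals can be sketched in a few lines. For determinism the cut point is passed explicitly here; a real operator would draw it at random, and this helper is an illustration rather than the DEME implementation.

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

using Bits = std::vector<int>;

// Exchange the tails of two parents after a cut point, producing two children.
inline std::pair<Bits, Bits> singlePointCrossover(const Bits& a, const Bits& b,
                                                  std::size_t cut) {
    Bits c1(a.begin(), a.begin() + cut);   // head of a
    Bits c2(b.begin(), b.begin() + cut);   // head of b
    c1.insert(c1.end(), b.begin() + cut, b.end());  // tail of b
    c2.insert(c2.end(), a.begin() + cut, a.end());  // tail of a
    return {c1, c2};
}
```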
Todo list
The following list describes the ongoing tasks and the next planned enhancements:
 The binary-Gray representation is incomplete.
 The implementation of the Nsga2 algorithm (NSGA-II) must be exhaustively evaluated and compared against the original implementation to ensure that our implementation is correct. This task also implies evaluating the available mutation and crossover operators, as well as the different representations for real variables (real and binary-coded real).
 DEME versions of PAES and SPEA2 are currently being developed.
 A distributed island version of PAES using MPI has been developed; we are in the process of testing it for inclusion in DEME.
Using DEME
Installation
There is no special installation mechanism; you simply have to untar the file containing the codes and then compile them by typing make.
The software of the DEME project is organized into the following directories:

moea: contains the classes of the library libmoea 
problems: the multiobjective problems are described here 
multiObjective: contains the following algorithms:
 nsga2: the NSGA-II algorithm 
 ssmo: the SSMO algorithm 
 abyss: the AbYSS algorithm

Compilation
To compile the programs, just type "make" in each directory, following this order: first the directories moea and problems, and then the rest of the directories. Alternatively, you can type "make" in the root directory.
Executing the Programs
This section contains a general guideline about how to run each of the implemented algorithms.
Let <metah> be the name of an algorithm (Nsga2, Ssmo, Abyss). Each executable program is named <metah>Seq, and there must exist a configuration file, named <metah>.cfg, in the working directory (that is, the directory the program is invoked from). Once you have modified that file according to your preferences (crossover probability, maximum number of iterations, etc.), just type:
$ <metah>Seq problem
Currently, the program produces two output files, named "VAR" and "FUN". The first one contains the values of the decision variables, while the second one stores the values of the objective functions. You can easily change the names of these files by modifying them in the corresponding <metah>Seq.cpp program.
Each <metah>Seq.cpp program can require different invocation parameters. They are documented in the corresponding code; in any case, a usage message is printed if the parameters are incorrect.
Adding new Problems
In this section we detail how to define a multiobjective problem to be solved by the applications of the project. The code has been designed to simplify this task as much as possible. The key is the inheritance mechanism: a base class, MultiobjectiveProblem, must be inherited by all the classes representing problems.
Let us suppose we want to add a new unconstrained multiobjective problem called Example. The easiest way is to take an existing problem as a reference; for example, let us choose the problem Fonseca. This problem is defined in a class of the same name, in the files Fonseca.h and Fonseca.cpp, in the directory problems. The steps required to add the new problem are the following:

Create a header file for the class. In our example, it should be named Example.h. 
Create the implementation file Example.cpp. In this file we have to define the constructor of the class and the method evaluate. The constructor contains statements to specify the number of objective functions, the number of state variables, the number of constraints, the limits of the state variables, and the number of precision bits per binary-coded real variable. 
Include the header file Example.h in the file ProblemHeaders.h. 
Modify the makefile to add the object file Example.o to the macro named OBJS.
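The steps above can be sketched as follows. The real Example class would inherit from the project's MultiobjectiveProblem; since that API is not reproduced in this document, a minimal stand-in base is declared so the sketch is self-contained, and all member names are illustrative assumptions.

```cpp
#include <cassert>
#include <vector>

// Stand-in for the project's problem base class (hypothetical names).
struct ProblemBase {
    int numberOfObjectives = 0;
    int numberOfVariables = 0;
    int numberOfConstraints = 0;
    std::vector<double> lowerLimit, upperLimit;
    virtual ~ProblemBase() = default;
    virtual std::vector<double> evaluate(const std::vector<double>& x) const = 0;
};

// Contents of Example.h / Example.cpp, merged for brevity.
class Example : public ProblemBase {
public:
    Example() {
        // Constructor: declare the shape of the problem.
        numberOfObjectives = 2;
        numberOfVariables = 1;
        numberOfConstraints = 0;    // unconstrained problem
        lowerLimit = {-5.0};
        upperLimit = {5.0};
    }
    // A toy bi-objective function: f1(x) = x^2, f2(x) = (x - 1)^2.
    std::vector<double> evaluate(const std::vector<double>& x) const override {
        return { x[0] * x[0], (x[0] - 1.0) * (x[0] - 1.0) };
    }
};
```

After writing the class, the remaining steps are purely mechanical: include Example.h in ProblemHeaders.h and add Example.o to the OBJS macro of the makefile.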
If the new problem has constraints, then we should define the method evaluateConstraints in the file Example.cpp. Currently, this method calculates two values: the overall constraint violation and the number of violated constraints, but it can be modified to obtain any other desired value. As examples, see the implementation files of the problems Viennet4 or Tanaka.
If you are interested in solving a binary problem, please take the OneMax or Zdt5 problems as examples.
Download
The latest version of the software can be downloaded from deme
Copyright
Copyright © 2003-2006 by Antonio J. Nebro.
Permission to use, copy, modify, and distribute this software and its documentation under the terms of the GNU General Public License is hereby granted. No representations are made about the suitability of this software for any purpose. It is provided "as is" without express or implied warranty. See the GNU General Public License for more details.
Description of the Algorithms
This section is intended to contain a detailed description of the implemented algorithms. Currently, it describes the SSMO and AbYSS algorithms.
SSMO description
SSMO stands for Scatter Search algorithm for Multiobjective Optimization. We have developed this algorithm with the following goals in mind:
 Study the application of scatter search to solving multiobjective optimization problems. The resulting algorithm should be competitive against other well-known algorithms such as NSGA-II or PAES.
 We intend the algorithm to be based on concepts such as Pareto dominance, niching, and Pareto ranking.
Our algorithm is based on the one described in [Glover et al, 2003] for single-objective optimization. We adapt this algorithm to multiobjective optimization by redefining the improvement and reference set update methods, as well as by using the set P as a kind of archive to store the nondominated solutions found during the execution of the algorithm.
For a more detailed description of SSMO, please see [Nebro et al, 2005].
A pseudocode describing the SSMO algorithm is the following:
    construct the initial set P
    // Outer loop
    until (stop condition) {
      referenceSetUpdate(build=true)
      subsetGeneration()
      // Scatter Search main loop
      while (new subsets are generated) {
        combination()
        for (each combined individual) {
          improvement()
          referenceSetUpdate(build=false)
        } // for
        subsetGeneration()
      } // while
      add RefSet1 to P
    } // until
    cutoff()
Currently we have developed two versions of SSMO. The first version, SSMOv1 (Ssmo1), uses the ranking and crowding procedures of NSGA-II (to obtain RefSet1) in the reference set update method as well as in the cutoff method. The second version, SSMOv2 (Ssmo2), differs from SSMOv1 in applying a clustering algorithm to obtain centroids (representative individuals) of the individuals having the best ranking in the reference set update method.
AbYSS description
AbYSS stands for Archive-based hYbrid Scatter Search, and it is an alternative approach to SSMO. Compared with the latter, the main features of AbYSS are:
 An external archive is used to store the nondominated solutions found during the execution of the algorithm.
 Two different density estimators are used: the crowding distance of NSGA-II to manage the archive when it becomes full, and the raw fitness strength of SPEA2 to select the best individuals from the initial set to obtain RefSet1.
For a more detailed description of AbYSS, please see [Nebro et al, 2006].
References
 [Deb et al, 2002]. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II". IEEE Transactions on Evolutionary Computation 6 (2002).
 [Glover et al, 2003]. Glover, F., Laguna, M., Martí, R.: "Scatter Search". In Ghosh, A., Tsutsui, S., eds.: Advances in Evolutionary Computing: Theory and Applications. Springer (2003).
 [Nebro et al, 2005]. Nebro, A.J., Luna, F., Alba, E.: "New Ideas in Applying Scatter Search to Multiobjective Optimization". EMO 2005. Guanajuato, México (2005).
 [Nebro et al, 2006]. Nebro, A.J., Luna, F., Alba, E., Beham, A., Dorronsoro, B.: "AbYSS: Adapting Scatter Search for Multiobjective Optimization". Tech. Rep. ITI-2006-2, Dept. Lenguajes y Ciencias de la Computación, University of Málaga (2006).
 [Zitzler et al, 2001]. Zitzler, E., Laumanns, M., Thiele, L.: "SPEA2: Improving the Strength Pareto Evolutionary Algorithm for Multiobjective Optimization". Technical Report 103, Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH) Zurich (2001).