256 IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 8, NO. 3, JUNE 2004
Handling Multiple Objectives With
Particle Swarm Optimization
Carlos A. Coello Coello, Member, IEEE, Gregorio Toscano Pulido, and Maximino Salazar Lechuga
Abstract—This paper presents an approach in which Pareto
dominance is incorporated into particle swarm optimization
(PSO) in order to allow this heuristic to handle problems with
several objective functions. Unlike other current proposals to
extend PSO to solve multiobjective optimization problems, our
algorithm uses a secondary (i.e., external) repository of particles
that is later used by other particles to guide their own flight. We
also incorporate a special mutation operator that enriches the
exploratory capabilities of our algorithm. The proposed approach
is validated using several test functions and metrics taken from the
standard literature on evolutionary multiobjective optimization.
Results indicate that the approach is highly competitive and that
it can be considered a viable alternative for solving multiobjective
optimization problems.
Index Terms—Evolutionary multiobjective optimization,
multiobjective optimization, multiobjective particle swarm opti-
mization, particle swarm optimization.
I. INTRODUCTION
THE USE OF evolutionary algorithms for multiobjective
optimization (an area called “evolutionary multiobjective
optimization,” or EMO for short) has significantly grown in
the last few years, giving rise to a wide variety of algorithms
[7]. Like any other research area, EMO currently exhibits certain
trends. One of them is to improve the efficiency both of
the algorithms and of the data structures used to store nondom-
inated vectors. EMO researchers have produced some clever
techniques to maintain diversity (e.g., the adaptive grid used
by the Pareto Archived Evolution Strategy (PAES) [21]), new
algorithms that use very small populations (e.g., the microGA
[6]), and data structures that make it possible to handle
unconstrained external archives (e.g., the dominated tree [12]).
Particle swarm optimization (PSO) is a relatively recent
heuristic inspired by the choreography of a bird flock. PSO has
been found to be successful in a wide variety of optimization
Manuscript received July 8, 2002; revised September 2, 2003. The work
of C. A. Coello Coello was supported in part by CONACyT under Project
34201-A. G. T. Pulido acknowledges support from CONACyT through a
scholarship to pursue graduate studies at the Computer Science Section of the
Electrical Engineering Department, CINVESTAV-IPN, México. M. S. Lechuga
acknowledges support from CONACyT through a scholarship to pursue grad-
uate studies at the School of Computer Science, University of Birmingham,
Birmingham, U.K.
C. A. Coello Coello and G. T. Pulido are with the Sección de
Computación, Departamento de Ing. Eléctrica, CINVESTAV-IPN, México (e-mail:
computacion.cs.cinvestav.mx).
M. S. Lechuga is with the University of Birmingham, School of
Computer Science, Edgbaston, Birmingham B15 2TT, U.K. (e-mail:
Digital Object Identifier 10.1109/TEVC.2004.826067
tasks [19], but until recently it had not been extended to deal
with multiple objectives.
PSO seems particularly suitable for multiobjective optimiza-
tion mainly because of the high speed of convergence that
the algorithm presents for single-objective optimization [19].
In this paper, we present a proposal, called "multiobjective
particle swarm optimization" (MOPSO), which allows the PSO
algorithm to deal with multiobjective optimization
problems. Our current proposal is an improved version of
the algorithm reported in [5], in which we have added a
constraint-handling mechanism and a mutation operator that
considerably improves the exploratory capabilities of our
original algorithm.
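For reference, the canonical single-objective PSO flight that MOPSO builds on can be sketched as follows. This is a minimal illustrative Python sketch; the parameter values w, c1, and c2 are assumptions chosen for demonstration, not the settings used in the paper.

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.4, c1=2.0, c2=2.0):
    """One iteration of the canonical single-objective PSO update.

    velocity: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
    position: x = x + v

    positions, velocities, pbest: lists of per-particle vectors;
    gbest: the best position found by the whole swarm so far.
    """
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
    return positions, velocities
```

Each particle is pulled stochastically toward its own best position (pbest) and the swarm's best position (gbest); in the multiobjective setting, a single gbest no longer exists, which is precisely the difficulty MOPSO addresses with its external repository of leaders.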
MOPSO is validated using several standard test functions
reported in the specialized literature and compared against
three highly competitive EMO algorithms: the nondominated
sorting genetic algorithm-II [11] (NSGA-II), the PAES [21],
and the microgenetic algorithm for multiobjective optimization
(microGA) [6].
II. BASIC CONCEPTS
Definition 1 (Global Minimum): Given a function $f : \Omega \subseteq \mathbb{R}^n \to \mathbb{R}$, $\Omega \neq \emptyset$, for $\vec{x}^* \in \Omega$ the value $f^* \triangleq f(\vec{x}^*) > -\infty$ is called a global minimum if and only if

$$\forall \vec{x} \in \Omega : \quad f(\vec{x}^*) \leq f(\vec{x}). \tag{1}$$

Then, $\vec{x}^*$ is the global minimum solution, $f$ is the objective function, and the set $\Omega$ is the feasible region ($\Omega \subseteq S$), where $S$ represents the whole search space.
Definition 2 [General Multiobjective Optimization Problem (MOP)]: Find the vector $\vec{x}^* = [x_1^*, x_2^*, \ldots, x_n^*]^T$ which will satisfy the $m$ inequality constraints

$$g_i(\vec{x}) \leq 0, \quad i = 1, 2, \ldots, m \tag{2}$$

the $p$ equality constraints

$$h_i(\vec{x}) = 0, \quad i = 1, 2, \ldots, p \tag{3}$$

and will optimize the vector function

$$\vec{f}(\vec{x}) = [f_1(\vec{x}), f_2(\vec{x}), \ldots, f_k(\vec{x})]^T \tag{4}$$

where $\vec{x} = [x_1, x_2, \ldots, x_n]^T$ is the vector of decision variables.
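To make this general form concrete, the following is a minimal Python sketch of a hypothetical MOP instance with $n = 2$ decision variables, $k = 2$ objectives, one inequality constraint, and one equality constraint. The functions chosen here are illustrative assumptions, not one of the paper's test problems.

```python
def evaluate_mop(x):
    """Evaluate a small hypothetical MOP instance.

    Returns the objective vector (f1, f2) and a feasibility flag:
    feasible when g1(x) <= 0 and h1(x) = 0 (within a numerical tolerance).
    """
    f1 = x[0] ** 2 + x[1] ** 2           # first objective to minimize
    f2 = (x[0] - 1.0) ** 2 + x[1] ** 2   # second objective to minimize
    g1 = x[0] + x[1] - 2.0               # inequality constraint: g1 <= 0
    h1 = x[1]                            # equality constraint: h1 == 0
    feasible = g1 <= 0.0 and abs(h1) < 1e-9
    return (f1, f2), feasible
```

Note that the two objectives conflict: $f_1$ is minimized at $x = (0, 0)$ and $f_2$ at $x = (1, 0)$, so no single feasible point optimizes both simultaneously, which is what makes the vector optimization in (4) nontrivial.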
Definition 3 (Pareto Optimality): A point $\vec{x}^* \in \Omega$ is Pareto optimal if for every $\vec{x} \in \Omega$ and $I = \{1, 2, \ldots, k\}$ either

$$\forall i \in I \; \left( f_i(\vec{x}) = f_i(\vec{x}^*) \right) \tag{5}$$

or, there is at least one $i \in I$ such that

$$f_i(\vec{x}) > f_i(\vec{x}^*). \tag{6}$$
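Definition 3 rests on the notion of Pareto dominance. For a finite sample of objective vectors (minimization assumed throughout), the dominance test and the resulting nondominated filter can be sketched as follows; this is an illustrative sketch, not the paper's implementation.

```python
def dominates(u, v):
    """True if objective vector u Pareto-dominates v (minimization):
    u is no worse than v in every objective and strictly better in at
    least one."""
    return (all(a <= b for a, b in zip(u, v))
            and any(a < b for a, b in zip(u, v)))

def nondominated(points):
    """Return the points of a finite sample that are dominated by no
    other point in the sample (its Pareto front approximation)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

In MOPSO, a test of this kind decides which particles enter the external repository: only vectors that are nondominated with respect to the archive's current contents are retained.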
1089-778X/04$20.00 © 2004 IEEE