Applied Soft Computing 30 (2015) 529–548
A directional mutation operator for differential evolution algorithms

Xin Zhang a, Shiu Yin Yuen b,∗

a College of Electronic and Communication Engineering, Tianjin Normal University, Tianjin, China
b Department of Electronic Engineering, City University of Hong Kong, Hong Kong, China
Article info

Article history:
Received 26 June 2012
Received in revised form 3 February 2015
Accepted 3 February 2015
Available online 11 February 2015

Keywords:
Differential evolution
Directional mutation
Generic mutation operator
Global numerical optimization
Abstract

Differential evolution (DE) has been widely studied in the past decade. In its mutation operator, the random variations are derived from the difference of two randomly selected distinct individuals. The difference vector plays an important role in evolution. It is observed that the best fitness found so far by DE is not improved in every generation. In this article, a directional mutation operator is proposed. It attempts to recognize good variation directions and to increase the number of generations having fitness improvement. The idea is to construct a pool of difference vectors calculated when fitness is improved at a generation. The difference vector pool then guides the mutation search in the next generation, once only. The directional mutation operator can be applied to any DE mutation strategy. The purpose is to speed up the convergence of DE and improve its performance. The proposed method is evaluated experimentally on the CEC 2005 test set with dimension 30 and on the CEC 2008 test set with dimensions 100 and 1000. It is demonstrated that the proposed method yields a larger number of generations with fitness improvement than classic DE. It is combined with eleven DE algorithms as examples of how to combine it with other algorithms. After its incorporation, the performance of most of these DE algorithms is significantly improved. Moreover, simulation results show that the directional mutation operator helps balance the exploration and exploitation capacity of the tested DE algorithms. Furthermore, the directional mutation modifications save computational time compared to the original algorithms. The proposed approach is compared with the proximity based mutation operator, as both are claimed to be applicable to any DE mutation strategy. The directional mutation operator is shown to be better than the proximity based mutation operator on the five variants in the DE family. Finally, applications to two real world engineering optimization problems verify the usefulness of the proposed method.

© 2015 Elsevier B.V. All rights reserved.
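As a rough illustration of the pool mechanism summarized above, the following sketch reuses a difference vector stored from an improving generation in place of a fresh random difference. This is an illustrative sketch only: the function name, population layout and fallback rule are assumptions, and the paper's exact rules (such as the once-only use of the pool) are specified in later sections.

```python
import numpy as np

rng = np.random.default_rng(1)

def directional_mutation(pop, pool, F=0.5):
    """Sketch: when `pool` holds difference vectors collected in a
    generation that improved fitness, reuse one of those directions;
    otherwise fall back to a fresh random difference (as in DE/rand/1)."""
    n, _ = pop.shape
    mutants = np.empty_like(pop)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        r1 = rng.choice(others)  # random base vector index
        if pool:
            # guide the search along a previously successful direction
            diff = pool[rng.integers(len(pool))]
        else:
            r2, r3 = rng.choice([j for j in others if j != r1],
                                size=2, replace=False)
            diff = pop[r2] - pop[r3]
        mutants[i] = pop[r1] + F * diff
    return mutants
```

With an empty pool this reduces to the classic random-difference mutation, so the directional behaviour is an overlay on, not a replacement of, the underlying strategy.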
1. Introduction
Evolutionary algorithms (EAs) [1,2] are inspired by the natural evolution of species; their procedures parallel those in the evolutionary process of species. Typically, EAs are population-based and depend on variation operators and survivor selection to realize the evolutionary process. They assume only that function values can be obtained for any feasible solution; no assumption is made about the explicit expression or the differentiability of the function. In EAs, the function to be optimized is often called the fitness function; the domain of the variables, the search space; a feasible solution, an individual; and the function value of a feasible solution, the fitness or fitness value.
In practice, EAs have been applied to many fields such as engineering design [3], energy management [4], financial strategies [5] and computer vision [6]. These applications justify the usefulness of EAs.

∗ Corresponding author. Tel.: +852 34427717. E-mail addresses: xinzhang9-c@my.cityu.edu.hk (X. Zhang), kelviny.ee@cityu.edu.hk (S.Y. Yuen).
Generally, the study of EAs aims to propose algorithms that are applicable to a class of problems, are computationally efficient and converge quickly to the global optimum. Several popular EAs are the genetic algorithm (GA) [7], genetic programming (GP), evolution strategies (ES), evolutionary programming (EP) and differential evolution (DE). Note that some researchers consider DE a swarm intelligence algorithm. The classic version of DE is simple to implement, easy to use and fast.
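The simplicity of classic DE can be seen in its mutation step; the following is a minimal sketch of the common DE/rand/1 strategy (the function name, parameter names and population layout are assumptions for illustration, not the paper's notation):

```python
import numpy as np

rng = np.random.default_rng(0)

def de_rand_1(pop, F=0.5):
    """DE/rand/1 mutation: each mutant is a random base vector plus a
    scaled difference of two other randomly chosen individuals."""
    n, _ = pop.shape
    mutants = np.empty_like(pop)
    for i in range(n):
        # three distinct indices, all different from the target index i
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i],
                                size=3, replace=False)
        mutants[i] = pop[r1] + F * (pop[r2] - pop[r3])
    return mutants
```

The scale factor F and a subsequent crossover with the target vector are the only other ingredients of the classic algorithm, which is why DE is so easy to implement.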
Although some classic EAs are easy to program and computationally efficient, the classic version of an EA often gets stuck in a local optimum of the fitness function. Hence, numerous studies have sought to balance the exploitation and exploration in the search process of EAs. A good and robust EA should not only have a fast convergence rate, but should also reach the global optimum for complex fitness functions with many local optima.
DE, proposed in the mid-1990s, has been extensively studied, and many variants of DE have been reported in the past decade. The DE paradigm is shown to be very powerful. For example, it secures competitive rankings in all IEEE Congress on Evolutionary Computation
http://dx.doi.org/10.1016/j.asoc.2015.02.005
1568-4946/© 2015 Elsevier B.V. All rights reserved.