Expert Systems With Applications 183 (2021) 115352
Available online 9 June 2021
0957-4174/© 2021 Elsevier Ltd. All rights reserved.
A new optimization method based on COOT bird natural life model
Iraj Naruei a, Farshid Keynia b,*
a Department of Engineering, Kerman Branch, Islamic Azad University, Kerman, Iran
b Department of Energy Management and Optimization, Institute of Science and High Technology and Environmental Sciences, Graduate University of Advanced Technology, Kerman, Iran
ARTICLE INFO
Keywords:
Optimization techniques
Meta-heuristic algorithm
Coot Birds
Coot optimization
ABSTRACT
Recently, many intelligent algorithms have been proposed to find the best solutions for complex engineering
problems. These algorithms can search volatile, multi-dimensional solution spaces and find near-optimal answers
in a timely manner. In this paper, a new meta-heuristic method is proposed that is inspired by the behavior of a
swarm of birds called coots. The Coot algorithm imitates two different modes of movement of the birds on the
water surface: in the first phase, the movement of the birds is irregular, and in the second phase, the movements
are regular. The swarm moves towards a group of leaders to reach a food supply; the rear of the swarm moves in
the form of a chain of coots, with each coot following the one in front of it. The algorithm is then run on a number
of test functions, and the results are compared with those of well-known optimization algorithms. In addition, the
algorithm is applied to several real problems, such as tension/compression spring design, pressure vessel design,
welded beam design, multi-plate disc clutch brake, step-cone pulley, cantilever beam design, reducer design, and
rolling element bearing design, to confirm its applicability. The results show that this algorithm is capable of
outperforming most of the other optimization methods. The source code is currently available to the public at:
https://www.mathworks.com/matlabcentral/fileexchange/89102-coot-optimization-algorithm.
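The two movement modes described above can be illustrated with a minimal sketch. This is only an illustration of the chain-following idea, not the authors' exact update rules; the function names, the 50/50 phase split, the averaging update, and the sphere test objective are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Example objective: the sphere function, minimized at the origin."""
    return float(np.sum(x ** 2))

def coot_sketch(obj, dim=5, n_coots=20, n_leaders=4, iters=200,
                lb=-10.0, ub=10.0):
    """Hypothetical sketch of the two-phase coot movement (not the paper's exact method)."""
    pos = rng.uniform(lb, ub, size=(n_coots, dim))
    fitness = np.array([obj(p) for p in pos])
    for _ in range(iters):
        order = np.argsort(fitness)
        leaders = pos[order[:n_leaders]].copy()  # the best coots act as leaders
        for i in range(n_coots):
            if rng.random() < 0.5:
                # Phase 1 (irregular): random move toward a random point in the space.
                target = rng.uniform(lb, ub, size=dim)
                new = pos[i] + rng.random() * (target - pos[i])
            else:
                # Phase 2 (regular, chain movement): follow the coot in front;
                # the first coot in the chain follows a randomly chosen leader.
                front = pos[i - 1] if i > 0 else leaders[rng.integers(n_leaders)]
                new = 0.5 * (pos[i] + front)
            new = np.clip(new, lb, ub)
            f = obj(new)
            if f < fitness[i]:          # greedy acceptance keeps the better position
                pos[i], fitness[i] = new, f
    best = int(np.argmin(fitness))
    return pos[best], float(fitness[best])

best_x, best_f = coot_sketch(sphere)
print(best_f)
```

The greedy acceptance step is a simplification added here so the sketch converges monotonically; the published algorithm's leader update and parameter schedules differ.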
1. Introduction
Optimization is the process of finding the best answer, or the global
optimal point, for a problem. In optimization problems, the global
optimal point is the minimum or maximum value of a function. Optimization
issues can be found in all fields of study, which makes optimization
techniques an essential and important direction of study for
researchers. Meta-heuristic algorithms are a kind of stochastic algorithm
used to find an optimal response. Optimization methods
and algorithms are categorized into two groups: exact algorithms and
approximate algorithms. Exact algorithms are capable of finding the
optimal answer in a precise manner, but they are not efficient enough for
hard optimization problems, and their execution time grows exponentially
with the dimensions of the problem. Approximate algorithms
are capable of finding good (near-optimal) solutions in a short time for
hard optimization problems. Approximate algorithms are divided into
three categories: heuristic, meta-heuristic, and hyper-heuristic
algorithms. The two main problems of heuristic algorithms are getting
trapped at local optimum points and premature convergence to these
points. Meta-heuristic algorithms have been proposed to overcome these
deficiencies of heuristic algorithms (Spall, 2005). In fact, meta-heuristic
algorithms are a kind of approximate optimization algorithm
that provides mechanisms to escape from local optimum points and can be
applied to a wide range of problems. Various categories of these algorithms
have been developed in recent decades (Mahdavi et al., 2015).
The strengths of meta-heuristic methods are their simplicity, flexibility,
derivative-free mechanisms, and avoidance of local optima. Meta-heuristic
methods are inspired by physical phenomena, animal behaviors,
evolutionary concepts, and human phenomena. A categorization of well-known
algorithms is presented in Fig. 1. Many articles have tried to classify
optimization algorithms based on their sources of inspiration (Ertenlice & Kalayci,
2018; Hussain et al., 2018; Sotoudeh-Anvari & Hafezalkotob, 2018).
An important question arises here: when famous algorithms such as those
mentioned above already exist, why is there a need to propose new
algorithms? According to the No Free Lunch (NFL) theorem (Blum & Roli, 2003;
Wolpert & Macready, 1997), there is no single optimization algorithm that can
solve all optimization problems; in fact, the average performance of all
optimizers over all problems is almost the same. Thus, there are still many
problems that are not well solved by the popular optimization algorithms, and
offering new algorithms can solve such problems. This is the motivation
* Corresponding author.
E-mail addresses: irajnaruei@iauk.ac.ir (I. Naruei), f.keynia@kgut.ac.ir (F. Keynia).
https://doi.org/10.1016/j.eswa.2021.115352
Received 19 February 2020; Received in revised form 10 April 2021; Accepted 3 June 2021