Fish School Search (FSS), proposed by Bastos Filho and Lima Neto in 2008, is, in its basic version,[1] a unimodal optimization algorithm inspired by the collective behavior of fish schools. The mechanisms of feeding and coordinated movement were used as inspiration to create the search operators. The core idea is to make the fishes "swim" toward the positive gradient in order to "eat" and "gain weight". Collectively, the heavier fishes have more influence on the search process as a whole, which makes the barycenter of the fish school move toward better places in the search space over the iterations.[2]
The FSS algorithm is built upon a set of guiding principles shared with other swarm-intelligence techniques.[3]
FSS is a population-based search algorithm inspired by the behavior of swimming fishes that expand and contract the school while looking for food. Each fish's $n$-dimensional location represents a possible solution for the optimization problem. The algorithm keeps a weight for every fish, which represents a cumulative account of how successful the search has been for that fish. FSS is composed of a feeding operator and movement operators, the latter being divided into three sub-components: individual movement, collective-instinctive movement, and collective-volitive movement.[4]
Every fish in the school performs a local search looking for promising regions in the search space, as represented below:

$x_i(t+1) = x_i(t) + \mathrm{rand}(-1,1)\, step_{ind}$

where $x_i(t)$ and $x_i(t+1)$ represent the position of fish $i$ before and after the individual movement operator, respectively, $\mathrm{rand}(-1,1)$ is a uniformly distributed random number varying from -1 up to 1, and $step_{ind}$ is a parameter that defines the maximum displacement for this movement. The new position is only accepted if the fitness of the fish improves with the position change. If that is not the case, the fish remains in the same position and $\Delta x_i = 0$.
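A minimal Python sketch of the individual movement is given below. It assumes a NumPy position vector, a fitness function `f` to be maximized, and a random number drawn per dimension; these are illustrative choices rather than details fixed by the description above.

```python
import numpy as np

def individual_movement(x, f, step_ind, rng):
    """One individual-movement trial for a single fish.

    x        -- current position (1-D NumPy array)
    f        -- fitness function to be maximized (illustrative assumption)
    step_ind -- maximum displacement for this operator
    rng      -- numpy.random.Generator
    Returns the (possibly unchanged) position, the displacement, and the
    fitness gain, which are reused by the collective operators.
    """
    candidate = x + rng.uniform(-1.0, 1.0, size=x.shape) * step_ind
    delta_f = f(candidate) - f(x)
    if delta_f > 0:                      # move only on improvement
        return candidate, candidate - x, delta_f
    return x, np.zeros_like(x), 0.0      # stay put, delta x = 0
```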
A weighted average of the individual movements is calculated as follows:

$I(t) = \dfrac{\sum_{i=1}^{N} \Delta x_i \, \Delta f_i}{\sum_{i=1}^{N} \Delta f_i}$

The vector $I(t)$ represents the weighted average of the displacements of the fishes, where $\Delta f_i$ is the fitness improvement obtained by fish $i$ in its individual movement. This means that the fishes that experienced a larger improvement attract the others toward their positions. After the computation of $I(t)$, every fish moves according to:

$x_i(t+1) = x_i(t) + I(t)$
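A sketch of the collective-instinctive movement under the same illustrative conventions (NumPy arrays holding the positions, the individual displacements, and the fitness gains of the whole school) could look as follows.

```python
import numpy as np

def collective_instinctive_movement(X, dx, df):
    """Apply the collective-instinctive movement to the whole school.

    X  -- positions, shape (N, n)
    dx -- individual displacements, shape (N, n)
    df -- individual fitness gains, shape (N,)
    """
    denom = df.sum()
    if denom == 0:                      # no fish improved: no drift this iteration
        return X
    I = (dx * df[:, None]).sum(axis=0) / denom
    return X + I                        # every fish drifts by I(t)
```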
This operator is used to regulate the exploration/exploitation ability of the school during the search process. First of all, the barycenter of the school is calculated based on the position and the weight of each fish:

$B(t) = \dfrac{\sum_{i=1}^{N} x_i(t)\, W_i(t)}{\sum_{i=1}^{N} W_i(t)}$

Then, if the total school weight has increased from the last to the current iteration, the fishes are attracted toward the barycenter according to equation A. If the total school weight has not improved, the fishes are spread away from the barycenter according to equation B:

Eq. A: $x_i(t+1) = x_i(t) - step_{vol}\,\mathrm{rand}(0,1)\,\dfrac{x_i(t) - B(t)}{\mathrm{distance}\big(x_i(t), B(t)\big)}$

Eq. B: $x_i(t+1) = x_i(t) + step_{vol}\,\mathrm{rand}(0,1)\,\dfrac{x_i(t) - B(t)}{\mathrm{distance}\big(x_i(t), B(t)\big)}$

where $step_{vol}$ defines the size of the maximum displacement performed with this operator, $\mathrm{distance}\big(x_i(t), B(t)\big)$ is the Euclidean distance between the position of fish $i$ and the school barycenter, and $\mathrm{rand}(0,1)$ is a uniformly distributed random number varying from 0 up to 1.
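The collective-volitive movement can be sketched as below; the small `eps` term guarding against a zero distance is an implementation detail added here, not part of the original formulation.

```python
import numpy as np

def collective_volitive_movement(X, W, step_vol, school_got_heavier, rng, eps=1e-12):
    """Contract toward (or expand away from) the school barycenter.

    X  -- positions, shape (N, n);  W -- weights, shape (N,)
    school_got_heavier -- True if the total school weight increased this iteration
    """
    B = (X * W[:, None]).sum(axis=0) / W.sum()          # barycenter
    direction = X - B
    dist = np.linalg.norm(direction, axis=1, keepdims=True) + eps
    step = step_vol * rng.uniform(0.0, 1.0, size=(X.shape[0], 1))
    sign = -1.0 if school_got_heavier else 1.0          # Eq. A vs. Eq. B
    return X + sign * step * direction / dist
```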
Besides the movement operators, a feeding operator was also defined, used to update the weight of every fish according to:

$W_i(t+1) = W_i(t) + \dfrac{\Delta f_i}{\max\big(|\Delta f|\big)}$

where $W_i(t)$ is the weight parameter for fish $i$, $\Delta f_i$ is the fitness variation between the last and the new position, and $\max(|\Delta f|)$ represents the maximum absolute value of the fitness variation among all the fishes in the school. $W_i$ is only allowed to vary from 1 up to $W_{scale}$, which is a user-defined attribute. The weights of all fishes are initialized with the value $W_{scale}/2$.
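An illustrative implementation of the feeding operator; clipping the weights to the interval $[1, W_{scale}]$ reflects the bound stated above.

```python
import numpy as np

def feeding(W, df, w_scale):
    """Update the weights from the individual fitness gains df (shape (N,))."""
    max_abs = np.abs(df).max()
    if max_abs > 0:                       # avoid division by zero when no fish moved
        W = W + df / max_abs
    return np.clip(W, 1.0, w_scale)       # weights stay within [1, w_scale]
```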
The parameters $step_{ind}$ and $step_{vol}$ decay linearly according to:

$step_{ind}(t+1) = step_{ind}(t) - \dfrac{step_{ind}^{initial}}{Iter_{max}}$

and similarly:

$step_{vol}(t+1) = step_{vol}(t) - \dfrac{step_{vol}^{initial}}{Iter_{max}}$

where $step_{ind}^{initial}$ and $step_{vol}^{initial}$ are user-defined initial values for $step_{ind}$ and $step_{vol}$, respectively, and $Iter_{max}$ is the maximum number of iterations allowed in the search process.
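The linear decay can be sketched as follows; clamping the steps at zero is an extra safeguard added here and is not prescribed by the description above.

```python
def decay_steps(step_ind, step_vol, step_ind_init, step_vol_init, iter_max):
    """Linear decay of both step parameters, called once per iteration."""
    step_ind = max(step_ind - step_ind_init / iter_max, 0.0)
    step_vol = max(step_vol - step_vol_init / iter_max, 0.0)
    return step_ind, step_vol
```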
The density-based version of the algorithm (dFSS) excels at multimodal, hyper-dimensional functions. It includes modifications to the previous Feeding and Swimming operators, as well as new Memory and Partition operators. The latter two were introduced to account for the partition of the main school into subgroups. Changes were also made to the stop conditions, which now also have to consider the sub-swarms.[5]
wFSS is a weight-based niching version of FSS intended to produce multiple solutions. The niching strategy is based on a new operator called link formator, which defines leaders for the fishes in order to form sub-schools.[6]
In the original version of the algorithm, the individual movement component is only allowed to move a fish if it improves the fitness. However, in a very smooth search space, there would be many moving trials with no success and the algorithm could fail to converge. To solve this issue, a parameter X, with 0 <= X <= 1, was introduced in the individual component of the movement. X decays exponentially over the iterations and defines the probability of allowing a worsening move for each fish. This means that every time a fish tries to move to a position that does not improve its fitness, a random number is drawn, and if it is smaller than X the movement is allowed.[7]
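An illustrative sketch of this acceptance rule; the decay rate `tau` is a hypothetical parameter, since the source only states that X decays exponentially.

```python
import math

def sar_accept(delta_f, x_allow, rng):
    """Acceptance test with a worsening allowance.

    delta_f -- fitness variation of the attempted move
    x_allow -- current worsening-allowance probability, 0 <= X <= 1
    """
    return delta_f > 0 or rng.random() < x_allow

def decay_allowance(x0, t, tau):
    """Exponential decay of the allowance X over the iterations."""
    return x0 * math.exp(-t / tau)
```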
bFSS is intended to cope with premature convergence by using a binary encoding scheme for the internal mechanisms of the fish school search. It combines FSS with fuzzy modeling in a wrapper approach for feature selection.[8]
In MOFSS, the operators are adapted to solve multi-objective problems. The algorithm deploys an external archive to store the best non-dominated solutions found during the search process, an approach that has been extensively used in different bio-inspired multi-objective optimizers.[9][10] Furthermore, the solutions within the external archive are used to guide the fish movements in the proposed version.[11]
The knapsack problem is the following problem in combinatorial optimization: given a set of items, each with a weight and a value, determine which items to include in the collection so that the total weight is less than or equal to a given limit and the total value is as large as possible.
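The 0/1 variant of the problem is commonly solved exactly by dynamic programming; the sketch below is a textbook illustration and is unrelated to FSS itself.

```python
def knapsack_01(values, weights, capacity):
    """Classic dynamic-programming solution of the 0/1 knapsack problem.

    Returns the maximum total value achievable without exceeding `capacity`.
    """
    best = [0] * (capacity + 1)                 # best[c] = best value with capacity c
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):    # iterate backwards: each item used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Example: items (value, weight) = (60,10), (100,20), (120,30), capacity 50 -> 220
assert knapsack_01([60, 100, 120], [10, 20, 30], 50) == 220
```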
In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions. Evolution of the population then takes place after the repeated application of the above operators.
In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position, but is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
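A minimal sketch of one PSO update step; the inertia weight and acceleration coefficients shown are common illustrative values, not canonical constants.

```python
import numpy as np

def pso_step(X, V, p_best, g_best, rng, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update of standard PSO.

    X, V           -- particle positions and velocities, shape (N, n)
    p_best, g_best -- personal bests (N, n) and global best (n,)
    w, c1, c2      -- inertia and acceleration coefficients (illustrative values)
    """
    r1 = rng.random(X.shape)
    r2 = rng.random(X.shape)
    V = w * V + c1 * r1 * (p_best - X) + c2 * r2 * (g_best - X)
    return X + V, V
```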
A fitness function is a particular type of objective function that is used to summarise, as a single figure of merit, how close a given design solution is to achieving the set aims. Fitness functions are used in software architecture and evolutionary algorithms (EA), such as genetic programming and genetic algorithms to guide simulations towards optimal design solutions.
Branch and bound is a method for solving optimization problems by breaking them down into smaller sub-problems and using a bounding function to eliminate sub-problems that cannot contain the optimal solution. It is an algorithm design paradigm for discrete and combinatorial optimization problems, as well as mathematical optimization. A branch-and-bound algorithm consists of a systematic enumeration of candidate solutions by means of state space search: the set of candidate solutions is thought of as forming a rooted tree with the full set at the root. The algorithm explores branches of this tree, which represent subsets of the solution set. Before enumerating the candidate solutions of a branch, the branch is checked against upper and lower estimated bounds on the optimal solution, and is discarded if it cannot produce a better solution than the best one found so far by the algorithm.
In computer science and operations research, the ant colony optimization algorithm (ACO) is a probabilistic technique for solving computational problems that can be reduced to finding good paths through graphs. Artificial ants represent multi-agent methods inspired by the behavior of real ants. The pheromone-based communication of biological ants is often the predominant paradigm used. Combinations of artificial ants and local search algorithms have become a preferred method for numerous optimization tasks involving some sort of graph, e.g., vehicle routing and internet routing.
In machine learning, feature selection is the process of selecting a subset of relevant features for use in model construction. Feature selection techniques are used for several reasons, such as simplification of models to make them easier to interpret, shorter training times, and avoidance of the curse of dimensionality.
Stochastic gradient descent is an iterative method for optimizing an objective function with suitable smoothness properties. It can be regarded as a stochastic approximation of gradient descent optimization, since it replaces the actual gradient by an estimate thereof. Especially in high-dimensional optimization problems this reduces the very high computational burden, achieving faster iterations in exchange for a lower convergence rate.
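A minimal sketch of plain stochastic gradient descent; `grad_estimate` is a placeholder name for any routine returning an unbiased gradient estimate for a single sample.

```python
import numpy as np

def sgd(grad_estimate, theta, data, lr=0.01, epochs=10, rng=None):
    """Plain stochastic gradient descent over a list of samples.

    grad_estimate(theta, sample) must return an estimate of the gradient.
    """
    rng = rng or np.random.default_rng()
    for _ in range(epochs):
        for idx in rng.permutation(len(data)):   # visit samples in random order
            theta = theta - lr * grad_estimate(theta, data[idx])
    return theta
```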
Recurrent neural networks (RNNs) are a class of artificial neural network commonly used for sequential data processing. Unlike feedforward neural networks, which process data in a single pass, RNNs process data across multiple time steps, making them well-adapted for modelling and processing text, speech, and time series.
Estimation of distribution algorithms (EDAs), sometimes called probabilistic model-building genetic algorithms (PMBGAs), are stochastic optimization methods that guide the search for the optimum by building and sampling explicit probabilistic models of promising candidate solutions. Optimization is viewed as a series of incremental updates of a probabilistic model, starting with the model encoding an uninformative prior over admissible solutions and ending with the model that generates only the global optima.
A Block Matching Algorithm is a way of locating matching macroblocks in a sequence of digital video frames for the purposes of motion estimation. The underlying supposition behind motion estimation is that the patterns corresponding to objects and background in a frame of video sequence move within the frame to form corresponding objects on the subsequent frame. This can be used to discover temporal redundancy in the video sequence, increasing the effectiveness of inter-frame video compression by defining the contents of a macroblock by reference to the contents of a known macroblock which is minimally different.
In computer science and operations research, a memetic algorithm (MA) is an extension of the traditional genetic algorithm (GA) or, more generally, of an evolutionary algorithm (EA). It may provide a sufficiently good solution to an optimization problem. It uses a suitable heuristic or local search technique to improve the quality of solutions generated by the EA and to reduce the likelihood of premature convergence.
Limited-memory BFGS is an optimization algorithm in the family of quasi-Newton methods that approximates the Broyden–Fletcher–Goldfarb–Shanno algorithm (BFGS) using a limited amount of computer memory. It is a popular algorithm for parameter estimation in machine learning. The algorithm's target problem is to minimize $f(\mathbf{x})$ over unconstrained values of the real vector $\mathbf{x}$, where $f$ is a differentiable scalar function.
Covariance matrix adaptation evolution strategy (CMA-ES) is a particular kind of strategy for numerical optimization. Evolution strategies (ES) are stochastic, derivative-free methods for numerical optimization of non-linear or non-convex continuous optimization problems. They belong to the class of evolutionary algorithms and evolutionary computation. An evolutionary algorithm is broadly based on the principle of biological evolution, namely the repeated interplay of variation and selection: in each generation (iteration) new individuals are generated by variation of the current parental individuals, usually in a stochastic way. Then, some individuals are selected to become the parents in the next generation based on their fitness or objective function value $f(x)$. In this way, individuals with better and better $f$-values are generated over the generation sequence.
Multi-objective optimization or Pareto optimization is an area of multiple-criteria decision making that is concerned with mathematical optimization problems involving more than one objective function to be optimized simultaneously. Multi-objective optimization is a type of vector optimization that has been applied in many fields of science, including engineering, economics, and logistics, where optimal decisions need to be taken in the presence of trade-offs between two or more conflicting objectives. Minimizing cost while maximizing comfort when buying a car, and maximizing performance while minimizing fuel consumption and emission of pollutants of a vehicle are examples of multi-objective optimization problems involving two and three objectives, respectively. In practical problems, there can be more than three objectives.
In operations research, cuckoo search is an optimization algorithm developed by Xin-She Yang and Suash Deb in 2009. It has been shown to be a special case of the well-known $(\mu+\lambda)$-evolution strategy. It was inspired by the obligate brood parasitism of some cuckoo species, which lay their eggs in the nests of host birds of other species. Some host birds can engage in direct conflict with the intruding cuckoos. For example, if a host bird discovers the eggs are not its own, it will either throw these alien eggs away or simply abandon its nest and build a new nest elsewhere. Some cuckoo species, such as the New World brood-parasitic Tapera, have evolved in such a way that female parasitic cuckoos are often very specialized in mimicking the colors and pattern of the eggs of a few chosen host species. Cuckoo search idealized such breeding behavior, and thus can be applied to various optimization problems.
The Fly Algorithm is a computational method within the field of evolutionary algorithms, designed for direct exploration of 3D spaces in applications such as computer stereo vision, robotics, and medical imaging. Unlike traditional image-based stereovision, which relies on matching features to construct 3D information, the Fly Algorithm operates by generating a 3D representation directly from random points, termed "flies." Each fly is a coordinate in 3D space, evaluated for its accuracy by comparing its projections in a scene. By iteratively refining the positions of flies based on fitness criteria, the algorithm can construct an optimized spatial representation. The Fly Algorithm has expanded into various fields, including applications in digital art, where it is used to generate complex visual patterns.
Quantum optimization algorithms are quantum algorithms that are used to solve optimization problems. Mathematical optimization deals with finding the best solution to a problem from a set of possible solutions. Mostly, the optimization problem is formulated as a minimization problem, where one tries to minimize an error which depends on the solution: the optimal solution has the minimal error. Different optimization techniques are applied in various fields such as mechanics, economics and engineering, and as the complexity and amount of data involved rise, more efficient ways of solving optimization problems are needed. Quantum computing may allow problems which are not practically feasible on classical computers to be solved, or suggest a considerable speed up with respect to the best known classical algorithm.
In evolutionary computation, Minimum Population Search (MPS) is a computational method that optimizes a problem by iteratively trying to improve a set of candidate solutions with regard to a given measure of quality. It solves a problem by evolving a small population of candidate solutions by means of relatively simple arithmetical operations.
In mathematics, the spiral optimization (SPO) algorithm is a metaheuristic inspired by spiral phenomena in nature.