Fish School Search

Fish School Search (FSS), proposed by Bastos Filho and Lima Neto in 2008, [1] is, in its basic version, a unimodal optimization algorithm inspired by the collective behavior of fish schools. The mechanisms of feeding and coordinated movement were used as inspiration to create the search operators. The core idea is to make the fish "swim" toward the positive gradient in order to "eat" and "gain weight". Collectively, the heavier fish have more influence on the search process as a whole, which makes the barycenter of the fish school move toward better places in the search space over the iterations. [2]

The FSS uses the following principles: [3]

  1. Simple computations in all individuals (i.e. fish)
  2. Various means of storing information (i.e. weights of fish and school barycenter)
  3. Local computations (i.e. swimming is composed of distinct components)
  4. Low communication between neighboring individuals (i.e. fish should think locally but also be socially aware)
  5. Minimal centralized control (mainly for self-control of the school radius)
  6. Some distinct diversity mechanisms (to avoid undesirable flocking behavior)
  7. Scalability (in terms of complexity of the optimization/search tasks)
  8. Autonomy (i.e. ability to self-control its functioning)

Algorithm

FSS is a population-based search algorithm inspired by the behavior of swimming fish that expand and contract while looking for food. Each fish's $n$-dimensional location represents a possible solution for the optimization problem. The algorithm keeps a weight for every fish, which represents a cumulative account of how successful the search has been for that fish. FSS is composed of the feeding operator and the movement operator, the latter being divided into three sub-components, which are: [4]

Individual component of the movement

Every fish in the school performs a local search looking for promising regions in the search space. It is done as represented below:

$$\vec{x}_i(t+1) = \vec{x}_i(t) + \mathrm{rand}(-1,1) \cdot step_{ind}$$

where $\vec{x}_i(t)$ and $\vec{x}_i(t+1)$ represent the position of fish $i$ before and after the individual movement operator, respectively. $\mathrm{rand}(-1,1)$ is a uniformly distributed random number varying from -1 up to 1 and $step_{ind}$ is a parameter that defines the maximum displacement for this movement. The new position is only accepted if the fitness of the fish improves with the position change. If that is not the case, the fish remains in the same position and $\Delta\vec{x}_i(t) = 0$.
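A minimal NumPy sketch of the individual movement (assuming a minimization problem; all function and variable names here are illustrative, not from the original paper):

```python
import numpy as np

def individual_movement(positions, fitness, step_ind, objective):
    """One individual-movement step for every fish (minimization assumed).

    positions: (N, n) array of fish locations; step_ind: maximum displacement.
    Returns the updated positions, fitness, and per-fish displacements.
    """
    n_fish, dim = positions.shape
    candidates = positions + np.random.uniform(-1.0, 1.0, (n_fish, dim)) * step_ind
    new_fitness = np.array([objective(c) for c in candidates])
    improved = new_fitness < fitness                       # accept only improving moves
    displacement = np.where(improved[:, None], candidates - positions, 0.0)
    positions = positions + displacement                   # unchanged fish keep dx = 0
    fitness = np.where(improved, new_fitness, fitness)
    return positions, fitness, displacement
```

The displacement of every non-improving fish is kept as zero, which matters later: both the feeding and the collective-instinctive operators consume these displacements.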

Collective-instinctive component of the movement

An average of the individual movements is calculated based on the following:

$$\vec{I}(t) = \frac{\sum_{i=1}^{N} \Delta\vec{x}_i(t)\, \Delta f_i(t)}{\sum_{i=1}^{N} \Delta f_i(t)}$$

The vector $\vec{I}(t)$ represents the weighted average of the displacements of each fish, so the fish that experienced the higher improvements attract the others toward their positions. After the vector computation, every fish is encouraged to move according to:

$$\vec{x}_i(t+1) = \vec{x}_i(t) + \vec{I}(t)$$
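A minimal NumPy sketch of this drift (names are illustrative, not from the original paper; the small epsilon guards the division when no fish improved):

```python
import numpy as np

def collective_instinctive(positions, displacements, delta_f):
    """Shift every fish by I, the improvement-weighted mean of the
    individual displacements (delta_f >= 0, zero for non-improving fish)."""
    eps = 1e-12
    drift = (displacements * delta_f[:, None]).sum(axis=0) / (delta_f.sum() + eps)
    return positions + drift
```

Note that the same drift vector is applied to the whole school, which is what pulls the population toward the region where the largest improvements happened.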

Collective-volitive component of the movement

This operator is used in order to regulate the exploration/exploitation ability of the school during the search process. First of all, the barycenter of the school is calculated based on the position and the weight of each fish:

$$\vec{B}(t) = \frac{\sum_{i=1}^{N} \vec{x}_i(t)\, W_i(t)}{\sum_{i=1}^{N} W_i(t)}$$

and then, if the total school weight has increased from the last to the current iteration, the fishes are attracted to the barycenter according to equation A. If the total school weight has not improved, the fishes are spread away from the barycenter according to equation B:

Eq. A:

$$\vec{x}_i(t+1) = \vec{x}_i(t) - step_{vol}\,\mathrm{rand}(0,1)\,\frac{\vec{x}_i(t)-\vec{B}(t)}{d(\vec{x}_i(t),\vec{B}(t))}$$

Eq. B:

$$\vec{x}_i(t+1) = \vec{x}_i(t) + step_{vol}\,\mathrm{rand}(0,1)\,\frac{\vec{x}_i(t)-\vec{B}(t)}{d(\vec{x}_i(t),\vec{B}(t))}$$

where $step_{vol}$ defines the size of the maximum displacement performed with the use of this operator, $d(\vec{x}_i(t),\vec{B}(t))$ is the Euclidean distance between the fish position and the school barycenter, and $\mathrm{rand}(0,1)$ is a uniformly distributed random number varying from 0 up to 1.
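A minimal NumPy sketch of the volitive step (names are illustrative, not from the original paper):

```python
import numpy as np

def collective_volitive(positions, weights, step_vol, school_got_heavier, rng):
    """Contract toward (Eq. A) or expand from (Eq. B) the weighted barycenter,
    depending on whether the total school weight increased this iteration."""
    barycenter = (positions * weights[:, None]).sum(axis=0) / weights.sum()
    direction = positions - barycenter
    dist = np.linalg.norm(direction, axis=1, keepdims=True) + 1e-12
    rand = rng.uniform(0.0, 1.0, (positions.shape[0], 1))
    sign = -1.0 if school_got_heavier else 1.0   # A: contract, B: expand
    return positions + sign * step_vol * rand * direction / dist
```

Contraction (a heavier school means the search is paying off) tightens the school around promising regions, while expansion restores diversity when the search stagnates.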

Besides the movement operators, a feeding operator was also defined, used in order to update the weights of every fish according to:

$$W_i(t+1) = W_i(t) + \frac{\Delta f_i(t)}{\max\left(|\Delta f(t)|\right)}$$

where $W_i(t)$ is the weight parameter for fish $i$, $\Delta f_i(t)$ is the fitness variation between the last and the new position, and $\max(|\Delta f(t)|)$ represents the maximum absolute value of the fitness variation among all the fish in the school. $W_i$ is only allowed to vary from 1 up to $w_{scale}$, which is a user-defined attribute. The weights of all fish are initialized with the value $w_{scale}/2$.
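A one-function NumPy sketch of the feeding update (names illustrative; the epsilon guards the division when no fish changed fitness):

```python
import numpy as np

def feeding(weights, delta_f, w_scale):
    """Grow or shrink each weight by its normalized fitness improvement,
    keeping every weight inside [1, w_scale] as the algorithm requires."""
    max_abs = np.max(np.abs(delta_f)) + 1e-12
    return np.clip(weights + delta_f / max_abs, 1.0, w_scale)
```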

The pseudo-code for FSS

  1. Initialize user parameters
  2. Initialize fishes positions randomly
  3. while Stopping condition is not met do
  4. Calculate fitness for each fish
  5. Run individual movement operator
  6. Calculate fitness for each fish
  7. Run feeding operator
  8. Run collective-instinctive movement operator
  9. Run collective-volitive movement operator
  10. end while
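Putting the pseudo-code together, here is a compact, self-contained sketch of the whole loop (minimization; every name, default value, and the linear step-decay schedule are illustrative choices, not values prescribed by the original paper):

```python
import numpy as np

def fss(objective, dim, n_fish=30, iters=200, step_ind=1.0, step_vol=0.1,
        w_scale=500.0, bounds=(-10.0, 10.0), seed=42):
    """A compact Fish School Search sketch (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_fish, dim))
    w = np.full(n_fish, w_scale / 2.0)              # weights start at w_scale/2
    f = np.apply_along_axis(objective, 1, x)
    for t in range(iters):
        si = step_ind * (1.0 - t / iters)           # linearly decayed steps
        sv = step_vol * (1.0 - t / iters)
        # 1) individual movement: accept improving moves only
        cand = x + rng.uniform(-1.0, 1.0, x.shape) * si
        fc = np.apply_along_axis(objective, 1, cand)
        ok = fc < f
        dx = np.where(ok[:, None], cand - x, 0.0)
        df = np.where(ok, f - fc, 0.0)              # fitness improvement >= 0
        x, f = x + dx, np.where(ok, fc, f)
        # 2) feeding: normalized improvement, clipped to [1, w_scale]
        prev_total_w = w.sum()
        w = np.clip(w + df / (np.abs(df).max() + 1e-12), 1.0, w_scale)
        # 3) collective-instinctive drift
        x = x + (dx * df[:, None]).sum(axis=0) / (df.sum() + 1e-12)
        # 4) collective-volitive: contract if the school got heavier, else expand
        b = (x * w[:, None]).sum(axis=0) / w.sum()
        d = np.linalg.norm(x - b, axis=1, keepdims=True) + 1e-12
        sign = -1.0 if w.sum() > prev_total_w else 1.0
        x = x + sign * sv * rng.uniform(0.0, 1.0, (n_fish, 1)) * (x - b) / d
        f = np.apply_along_axis(objective, 1, x)
    i = int(np.argmin(f))
    return x[i], float(f[i])
```

Running it on a sphere function, `fss(lambda v: float((v**2).sum()), dim=2)`, drives the school toward the origin.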

The parameters $step_{ind}$ and $step_{vol}$ decay linearly according to:

$$step_{ind}(t+1) = step_{ind}(t) - \frac{step_{ind}^{initial}}{iter_{max}}$$

and similarly:

$$step_{vol}(t+1) = step_{vol}(t) - \frac{step_{vol}^{initial}}{iter_{max}}$$

where $step_{ind}^{initial}$ and $step_{vol}^{initial}$ are user-defined initial values for $step_{ind}$ and $step_{vol}$, respectively, and $iter_{max}$ is the maximum number of iterations allowed in the search process.
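Equivalently, the decayed value at iteration t can be computed in closed form (a one-line sketch; the function name is illustrative):

```python
def decayed_step(step_initial, t, iter_max):
    """Linear decay: subtracting step_initial/iter_max each iteration
    gives step(t) = step_initial * (1 - t / iter_max)."""
    return step_initial * (1.0 - t / iter_max)
```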

Variations of FSS

dFSS (Density-based Fish School Search)

This version excels on multimodal hyper-dimensional functions. It includes modifications to the previous Feeding and Swimming operators, as well as two new operators: Memory and Partition. The latter two were introduced to account for the partition of the main school into subgroups. Some changes were also included in the stop conditions, which now also have to consider the subswarms. [5]

wFSS (Weight-based Fish School Search)

wFSS is a weight-based niching version of FSS intended to produce multiple solutions. The niching strategy is based on a new operator called the link formator, which defines leaders for the fish in order to form sub-schools. [6]

FSS-SAR (Stagnation Avoidance Routine Fish School Search)

In the original version of the algorithm, the individual movement component only moves a fish if the move improves its fitness. However, on a very flat search space (e.g. a plateau), there would be many movement trials with no success and the algorithm could fail to converge. To address this, a parameter X, with 0 <= X <= 1, was introduced in the individual component of the movement. X decays exponentially along the iterations and gives the probability that a worsening move is allowed for each fish. This means that every time a fish tries to move to a position that does not improve its fitness, a random number is drawn, and if it is smaller than X the movement is performed anyway. [7]
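The acceptance rule can be sketched as follows (the initial value x0 and the decay rate are illustrative assumptions; the description above only specifies that X decays exponentially):

```python
import math
import random

def accept_move(improved, t, x0=1.0, decay=0.05):
    """FSS-SAR-style acceptance: improving moves always pass; a worsening
    move passes with probability X = x0 * exp(-decay * t) at iteration t."""
    x = x0 * math.exp(-decay * t)
    return improved or (random.random() < x)
```

Early in the run nearly every worsening move is tolerated, which lets fish traverse plateaus; late in the run the routine converges back to the strict greedy rule of basic FSS.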

bFSS (Binary Fish School Search)

The bFSS is intended to cope with premature convergence, proposing a binary encoding scheme for the internal mechanisms of the fish school search. It combined FSS with fuzzy modeling in a wrapper approach for feature selection. [8]

MOFSS (Multi-Objective Fish School Search)

In MOFSS the operators are adapted to solve multi-objective problems. The algorithm deploys an external archive to store the best non-dominated solutions found during the search process, an approach that has been extensively used in bio-inspired multi-objective optimizers. [9] [10] Furthermore, the solutions within the external archive are used to guide the fish movements in the proposed version. [11]

References

  1. C. J. A. Bastos Filho, F. B. de Lima Neto, A. J. C. C. Lins, A. I. S. Nascimento, and M. P. Lima, "A novel search algorithm based on fish school behavior," in IEEE International Conference on Systems, Man and Cybernetics (SMC 2008), 2008, pp. 2646–2651.
  2. de Lima Neto, Fernando Buarque, and Marcelo Gomes Pereira de Lacerda. "Multimodal Fish School Search Algorithms Based on Local Information for School Splitting." 2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence. IEEE, 2013
  3. "FBLN – Prof. Dr. Fernando Buarque de Lima Neto, B.Sc. M.Sc. DIC-Ph.D.(Imperial College/UK) Hab(BR) SM-IEEE(USA) Alexander von Humboldt-Fellow(DE) Academy of Science-Fellow(PE/BR)".
  4. J. B. Monteiro, I. M. C. Albuquerque, F. B. L. Neto, and F. V. S. Ferreira, “Optimizing multi-plateau functions with FSS-SAR (Stagnation Avoidance Routine),” Submitted to IEEE Symposium Series on Computational Intelligence, 2016.
  5. Madeiro, S. S., de Lima-Neto, F. B., Bastos-Filho, C. J. A., & do Nascimento Figueiredo, E. M. (2011, June). Density as the segregation mechanism in fish school search for multimodal optimization problems. In International Conference in Swarm Intelligence (pp. 563-572). Springer Berlin Heidelberg.
  6. F. Buarque De Lima Neto and M. Gomes Pereira de Lacerda, “Weight based fish school search,” in Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on. IEEE, 2014, pp. 270–277.
  7. J. B. Monteiro, I. M. C. Albuquerque, F. B. L. Neto, and F. V. S. Ferreira, “Optimizing multi-plateau functions with FSS-SAR (Stagnation Avoidance Routine),” Submitted to IEEE Symposium Series on Computational Intelligence, 2016.
  8. Sargo, João AG, et al. "Binary Fish School Search applied to feature selection: Application to ICU readmissions." 2014 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2014.
  9. Deb, K., Thiele, L., Laumanns, M., & Zitzler, E.(2002) Scalable Multi-Objective Optimization Test Problems, In: IEEE Congress on Evolutionary Computation (pp. 825–830).
  10. Nebro, A. J., Durillo, J. J., García-Nieto, J., Coello Coello, C. A., Luna, F., & Alba, E. (2009) SMPSO: A new PSO-based metaheuristic for multi-objective optimization, In: IEEE Symposium on Computational Intelligence in Multicriteria Decision-Making (pp. 66–73). doi:10.1109/MCDM.2009.4938830
  11. Bastos-Filho, Carmelo JA, and Augusto CS Guimarães. "Multi-Objective Fish School Search." International Journal of Swarm Intelligence Research (IJSIR) 6.1 (2015): 23-40.