Fly algorithm

History

The Fly Algorithm is a type of cooperative coevolution based on the Parisian approach. [1] The Fly Algorithm was first developed in 1999 for the application of evolutionary algorithms to computer stereo vision. [2] [3] Unlike the classical image-based approach to stereovision, which extracts image primitives then matches them in order to obtain 3-D information, the Fly Algorithm is based on the direct exploration of the 3-D space of the scene. A fly is defined as a 3-D point described by its coordinates (x, y, z). Once a random population of flies has been created in a search space corresponding to the field of view of the cameras, its evolution (based on the evolution strategy paradigm) uses a fitness function that evaluates how likely a fly is to lie on the visible surface of an object, based on the consistency of its image projections. To this end, the fitness function uses the grey levels, colours and/or textures of the fly's calculated projections.
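As a minimal illustration, the projection-consistency idea behind such a fitness function can be sketched as follows. This is only a sketch under simplifying assumptions (ideal pinhole cameras given as 3×4 projection matrices, grey-level images, no occlusion handling); the names `project`, `fly_fitness` and the camera matrices are illustrative, not taken from the original papers.

```python
import numpy as np

def project(point, cam):
    # Project a 3-D point through a 3x4 pinhole camera matrix; return (u, v) in pixels.
    p = cam @ np.append(point, 1.0)
    return p[:2] / p[2]

def fly_fitness(point, img_left, img_right, cam_left, cam_right):
    # Lower is better: squared grey-level difference between the fly's
    # projections in the left and right images.
    ul, vl = project(point, cam_left).round().astype(int)
    ur, vr = project(point, cam_right).round().astype(int)
    h, w = img_left.shape
    if not (0 <= ul < w and 0 <= vl < h and 0 <= ur < w and 0 <= vr < h):
        return np.inf  # the fly projects outside at least one image
    return float((img_left[vl, ul] - img_right[vr, ur]) ** 2)

# Toy setup: two identical cameras separated by a horizontal baseline.
cam_left  = np.array([[100., 0., 32.,   0.], [0., 100., 32., 0.], [0., 0., 1., 0.]])
cam_right = np.array([[100., 0., 32., -10.], [0., 100., 32., 0.], [0., 0., 1., 0.]])
img_left  = np.full((64, 64), 0.5)
img_right = np.full((64, 64), 0.5)

# A fly at (0, 0, 1) projects inside both (constant) images, so both
# projections have the same grey level and the fitness is perfect (0).
score = fly_fitness(np.array([0., 0., 1.]), img_left, img_right, cam_left, cam_right)
```

A fly lying on a real surface would project onto pixels with similar grey levels in both views, giving a low (good) fitness value.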

The first application field of the Fly Algorithm was stereovision. [2] [3] [4] [5] While classical "image priority" approaches match features from the stereo images in order to build a 3-D model, the Fly Algorithm directly explores the 3-D space and uses image data to evaluate the validity of 3-D hypotheses. A variant called "Dynamic Flies" defines a fly as a 6-tuple (x, y, z, x', y', z') that includes the fly's velocity. [6] [7] The velocity components are not explicitly taken into account in the fitness calculation, but they are used when updating the flies' positions and are subject to similar genetic operators (mutation, crossover).

The application of Flies to obstacle avoidance in vehicles [8] exploits the fact that the population of flies is a time-compliant, quasi-continuously evolving representation of the scene, so vehicle control signals can be generated directly from the flies. The use of the Fly Algorithm is not strictly restricted to stereo images: other sensors (e.g. acoustic proximity sensors) may be added as additional terms to the fitness function being optimised. Odometry information can also be used to speed up the updating of the flies' positions, and conversely the flies' positions can be used to provide localisation and mapping information. [9]

Another application field of the Fly Algorithm is reconstruction for emission tomography in nuclear medicine. The Fly Algorithm has been successfully applied in single-photon emission computed tomography [10] and positron emission tomography. [11] [12] Here, each fly is considered a photon emitter and its fitness is based on the conformity of the simulated illumination of the sensors with the actual pattern observed on the sensors. Within this application, the fitness function has been re-defined to use the new concept of "marginal evaluation": the fitness of one individual is calculated as its (positive or negative) contribution to the quality of the global population. It is based on the leave-one-out cross-validation principle. A global fitness function evaluates the quality of the population as a whole; only then is the fitness of an individual (a fly) calculated, as the difference between the global fitness values of the population with and without the particular fly whose individual fitness has to be evaluated. [13] [14] In [15] the fitness of each fly is considered a "level of confidence" and is used during the voxelisation process to tweak the fly's individual footprint using implicit modelling (such as metaballs). This produces smooth and more accurate results.
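The marginal (leave-one-out) evaluation can be sketched as follows. In this toy version the flies' "contributions" are reduced to plain arrays rather than simulated sensor patterns, and all names are illustrative; the point is only the sign convention: since the global fitness is an error to minimise, a positive local fitness means the population performs better with the fly than without it.

```python
import numpy as np

def global_fitness(reference, contributions):
    # Error of the whole population: squared difference between the
    # reference data and the sum of all individual contributions.
    estimate = np.sum(contributions, axis=0)
    return float(np.sum((reference - estimate) ** 2))

def marginal_fitness(reference, contributions, i):
    # Leave-one-out: the fitness of fly i is the change in global error
    # when fly i is removed; > 0 means the fly is beneficial.
    without_i = contributions[:i] + contributions[i + 1:]
    return (global_fitness(reference, without_i)
            - global_fitness(reference, contributions))

reference = np.array([1.0, 1.0])
contributions = [np.array([1.0, 0.0]),   # useful fly
                 np.array([0.0, 1.0]),   # useful fly
                 np.array([0.5, 0.0])]   # redundant fly that adds error
```

Here `marginal_fitness(reference, contributions, 1)` is positive (a good fly) while `marginal_fitness(reference, contributions, 2)` is negative (a fly the population is better off without).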

More recently, the Fly Algorithm has been used in digital art to generate mosaic-like images and spray-paint effects. [16] Examples of images can be found on YouTube.

Parisian evolution

Here, the population of individuals is considered as a society where the individuals collaborate toward a common goal. This is implemented using an evolutionary algorithm that includes all the common genetic operators (e.g. mutation, crossover, selection). The main difference is in the fitness function. Here, two levels of fitness function are used: a local fitness that assesses the contribution of a single individual to the solution, and a global fitness that assesses the performance of the population as a whole.

In addition, a diversity mechanism is required to avoid individuals gathering in only a few areas of the search space. Another difference is in the extraction of the problem solution once the evolutionary loop terminates. In classical evolutionary approaches, the best individual corresponds to the solution and the rest of the population is discarded. Here, all the individuals (or individuals of a sub-group of the population) are collated to build the problem solution. The way the fitness functions are constructed and the way the solution extraction is made are of course problem-dependent.

Examples of Parisian Evolution applications include the Fly Algorithm itself.

Disambiguation

Parisian approach vs cooperative coevolution

Cooperative coevolution is a broad class of evolutionary algorithms where a complex problem is solved by decomposing it into subcomponents that are solved independently. The Parisian approach shares many similarities with cooperative coevolutionary algorithms. The Parisian approach makes use of a single population, whereas multiple species may be used in cooperative coevolutionary algorithms. Similar internal evolutionary engines are used in classical evolutionary algorithms, cooperative coevolutionary algorithms and Parisian evolution. The difference between cooperative coevolutionary algorithms and Parisian evolution resides in the population's semantics. Cooperative coevolutionary algorithms divide a big problem into sub-problems (groups of individuals) and solve them separately toward the big problem. [17] There is no interaction or breeding between individuals of different sub-populations, only between individuals of the same sub-population. In contrast, Parisian evolutionary algorithms solve the whole problem with a single population: all individuals cooperate to drive the population toward attractive areas of the search space.

Fly Algorithm vs particle swarm optimisation

Cooperative coevolution and particle swarm optimisation (PSO) share many similarities. PSO is inspired by the social behaviour of bird flocking and fish schooling. [18] [19] It was initially introduced as a tool for realistic animation in computer graphics. It uses complex individuals that interact with each other to build visually realistic collective behaviours by adjusting the individuals' behavioural rules (which may use random generators). In mathematical optimisation, every particle of the swarm follows its own random path biased toward the best particle of the swarm. In the Fly Algorithm, the flies aim at building spatial representations of a scene from actual sensor data; flies do not communicate or explicitly cooperate, and do not use any behavioural model.

Both algorithms are search methods that start with a set of random solutions, which are iteratively corrected toward a global optimum. However, the solution of the optimisation problem in the Fly Algorithm is the population (or a subset of the population): the flies implicitly collaborate to build the solution. In PSO the solution is a single particle, the one with the best fitness. Another main difference between the Fly Algorithm and PSO is that the Fly Algorithm is not based on any behavioural model; it only builds a geometrical representation.
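For comparison, a bare-bones PSO can be sketched as below. This is the generic textbook update rule with illustrative parameter values, not any specific reference implementation; note that the answer is the single global-best particle, not the population.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimise(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Each particle is pulled toward its personal best and the global best;
    # the solution is ONE particle (the global best), unlike the Fly
    # Algorithm where the whole population forms the solution.
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g

# Minimise the sphere function; the optimum is at the origin.
best = pso_minimise(lambda p: float(np.sum(p ** 2)))
```

On this toy sphere function the global-best particle converges close to the origin.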

Applications of the Fly Algorithm


Example: Tomography reconstruction

Example of reconstruction of a hot rod phantom using the Fly Algorithm. The sinogram (Y) of the object is known.

Tomography reconstruction is an inverse problem that is often ill-posed due to missing data and/or noise. The answer to the inverse problem is not unique, and in case of extreme noise level it may not even exist. The input data of a reconstruction algorithm may be given as the Radon transform or sinogram (Y) of the data to reconstruct (f). f is unknown; Y is known. The data acquisition in tomography can be modelled as:

Y = P[f] + ε

where P is the system matrix or projection operator and ε corresponds to some Poisson noise. In this case the reconstruction corresponds to the inversion of the Radon transform:

f̂ = P⁻¹[Y]

Note that P⁻¹ can account for noise, acquisition geometry, etc. The Fly Algorithm is an example of iterative reconstruction. Iterative methods in tomographic reconstruction are relatively easy to model:

f̂ = argmin_f ‖Y − P[f]‖₂²

where f̂ is an estimate of f that minimises an error metric (here the ℓ₂-norm, but other error metrics could be used) between Y and P[f̂]. Note that a regularisation term can be introduced to prevent overfitting and to smooth noise whilst preserving edges. Iterative methods can be implemented as follows:

Iterative correction in tomography reconstruction.

  (i) The reconstruction starts using an initial estimate of the image (generally a constant image),
 (ii) Projection data is computed from this image,
(iii) The estimated projections are compared with the measured projections,
 (iv) Corrections are made to correct the estimated image, and
  (v) The algorithm iterates until convergence of the estimated and measured projection sets.
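The five steps above can be sketched with a simple gradient-style correction: a Landweber-type iteration on a toy linear system. The matrix P, its size and the step size are illustrative, not taken from the original work.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model: P plays the role of the projection operator (system matrix).
P = rng.random((30, 10))
f_true = rng.random(10)
measured = P @ f_true                  # measured projections (the sinogram Y)

f = np.zeros(10)                       # (i)  initial estimate (constant image)
step = 1.0 / np.linalg.norm(P, 2) ** 2 # step size ensuring convergence
for _ in range(5000):
    estimated = P @ f                  # (ii)  compute projections from the estimate
    residual = measured - estimated    # (iii) compare with the measured projections
    f = f + step * P.T @ residual      # (iv)  correct the estimated image
                                       # (v)   iterate until convergence
```

After enough iterations, the estimate `f` converges to the least-squares solution, which equals `f_true` here because the toy data is noise-free.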

The pseudocode below is a step-by-step description of the Fly Algorithm for tomographic reconstruction. The algorithm follows the steady-state paradigm. For illustrative purposes, advanced genetic operators, such as mitosis, dual mutation, etc. [22] [23] are ignored. A JavaScript implementation can be found on Fly4PET.


algorithm fly-algorithm is
    input: number of flies (N),
           input projection data (p_reference)

    output: the fly population (F),
            the projections estimated from F (p_estimated),
            the 3-D volume corresponding to the voxelisation of F (V_F)

    postcondition: the difference between p_estimated and p_reference is minimal.

START
 1.   // Initialisation
 2.   // Set the position of the N flies, i.e. create the initial guess
 3.   for each fly i in fly population F do
 4.       F(i).x ← random(0, 1)
 5.       F(i).y ← random(0, 1)
 6.       F(i).z ← random(0, 1)
 7.       Add F(i)'s projection in p_estimated
 8.
 9.   // Compute the population's performance (i.e. the global fitness)
10.   G_fitness(F) ← Error_metrics(p_reference, p_estimated)
11.
12.   f_kill ← Select a random fly of F
13.
14.   Remove f_kill's contribution from p_estimated
15.
16.   // Compute the population's performance without f_kill
17.   G_fitness(F - {f_kill}) ← Error_metrics(p_reference, p_estimated)
18.
19.   // Compare the performances, i.e. compute the fly's local fitness
20.   L_fitness(f_kill) ← G_fitness(F - {f_kill}) - G_fitness(F)
21.
22.   If the local fitness is greater than 0,   // Thresholded-selection of a bad fly that can be killed
23.       then go to Step 26.   // f_kill is a good fly (the population's performance is better when f_kill is included): we should not kill it
24.       else go to Step 28.   // f_kill is a bad fly (the population's performance is worse when f_kill is included): we can get rid of it
25.
26.   Restore the fly's contribution, then go to Step 12.
27.
28.   Select a genetic operator
29.
30.   If the genetic operator is mutation,
31.       then go to Step 34.
32.       else go to Step 50.
33.
34.   f_reproduce ← Select a random fly of F
35.
36.   Remove f_reproduce's contribution from p_estimated
37.
38.   // Compute the population's performance without f_reproduce
39.   G_fitness(F - {f_reproduce}) ← Error_metrics(p_reference, p_estimated)
40.
41.   // Compare the performances, i.e. compute the fly's local fitness
42.   L_fitness(f_reproduce) ← G_fitness(F - {f_reproduce}) - G_fitness(F)
43.
44.   Restore the fly's contribution
45.
46.   If the local fitness is lower than or equal to 0,   // Thresholded-selection of a good fly that can reproduce
47.       then go to Step 53.   // f_reproduce is a good fly: we can allow it to reproduce
48.       else go to Step 34.   // f_reproduce is a bad fly: we should not allow it to reproduce
49.
50.   // New blood / immigration
51.   Replace f_kill by a new fly with a random position, go to Step 57.
52.
53.   // Mutation
54.   Copy f_reproduce into f_kill
55.   Slightly and randomly alter f_kill's position
56.
57.   Add the new fly's contribution to the population
58.
59.   If stop the reconstruction,
60.       then go to Step 63.
61.       else go to Step 10.
62.
63.   // Extract solution
64.   V_F ← voxelisation of F
65.
66.   return V_F
END
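This steady-state loop can be sketched in Python on a deliberately tiny problem: flies are points in [0, 1) and the "projection" of the population is a histogram compared against a reference histogram. For brevity the parent fly is accepted without the local-fitness test of Steps 46-48, and all names and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

N_BINS = 10
# Target "projection": 40 counts in total, concentrated in the middle bins.
reference = np.array([0., 0., 5., 10., 10., 10., 5., 0., 0., 0.])

def project(flies):
    # The population's "projection": a histogram of the flies' positions.
    hist, _ = np.histogram(flies, bins=N_BINS, range=(0.0, 1.0))
    return hist.astype(float)

def global_fitness(flies):
    # Error metric to minimise (squared difference with the reference).
    return float(np.sum((reference - project(flies)) ** 2))

def fly_algorithm(n_flies=40, iterations=2000, p_mutation=0.8, sigma=0.05):
    flies = list(rng.random(n_flies))
    for _ in range(iterations):
        g_all = global_fitness(flies)
        k = rng.integers(len(flies))                      # candidate to kill
        rest = flies[:k] + flies[k + 1:]
        local = global_fitness(rest) - g_all              # > 0: good fly
        if local > 0:
            continue                                      # do not kill a good fly
        if rng.random() < p_mutation:
            r = rng.integers(len(flies))                  # parent fly (no fitness test here)
            new = float(np.clip(flies[r] + rng.normal(0, sigma), 0.0, 1.0))
        else:
            new = float(rng.random())                     # new blood / immigration
        flies[k] = new                                    # replace the bad fly
    return flies

flies = fly_algorithm()
```

After a few thousand steady-state iterations the population's histogram moves much closer to the reference than a uniform random population would be.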

Example: Digital arts

Evolutionary search.
Image reconstructed after optimisation using a set of stripes as the pattern for each tile.

In this example, an input image is to be approximated by a set of tiles (for example as in an ancient mosaic). A tile has an orientation (angle θ), three colour components (R, G, B), a size (w, h) and a position (x, y, z). If there are N tiles, there are 9N unknown floating-point numbers to guess. In other words, for 5,000 tiles there are 45,000 numbers to find. Using a classical evolutionary algorithm where the answer of the optimisation problem is the best individual, the genome of an individual would be made up of 45,000 genes. This approach would be extremely costly in terms of complexity and computing time. The same applies to any classical optimisation algorithm. Using the Fly Algorithm, every individual mimics a tile and can be individually evaluated using its local fitness to assess its contribution to the population's performance (the global fitness). Here an individual has 9 genes instead of 9N, and there are N individuals. It can be solved as a reconstruction problem as follows:

F̂ = argmin_F Σₓ Σᵧ ( f(x, y) − P[F](x, y) )², with 0 ≤ x < w and 0 ≤ y < h

where f is the input image, x and y are the pixel coordinates along the horizontal and vertical axes respectively, w and h are the image width and height in number of pixels respectively, F is the fly population, and P is a projection operator that creates an image from flies. This projection operator can take many forms. In her work, Z. Ali Abbood [16] uses OpenGL to generate different effects (e.g. mosaics, or spray paint). OpenCL is also used to speed up the evaluation of the fitness functions. The algorithm starts with a population F that is randomly generated (see Line 3 in the algorithm above). F is then assessed using the global fitness to compute G_fitness(F) (see Line 10). G_fitness(F) is the objective function that has to be minimised.
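A sketch of the projection operator and global fitness for this example is given below. It is greatly simplified: axis-aligned rectangular grey tiles with no rotation, rather than the OpenGL renderer used in the cited work, and all names are illustrative.

```python
import numpy as np

W = H = 32

def rasterise(tiles):
    # P[F]: paint each tile (x, y, w, h, grey) onto a blank canvas.
    canvas = np.zeros((H, W))
    for x, y, w, h, grey in tiles:
        canvas[y:y + h, x:x + w] = grey
    return canvas

def global_fitness(image, tiles):
    # Objective to minimise: squared pixel difference between the
    # input image f and the image generated from the fly population.
    return float(np.sum((image - rasterise(tiles)) ** 2))

f_in = np.zeros((H, W))
f_in[8:24, 8:24] = 1.0                 # input image: a bright square

perfect_tile = [(8, 8, 16, 16, 1.0)]   # one tile exactly covering the square
offset_tile  = [(0, 0, 16, 16, 1.0)]   # a misplaced tile
```

The perfectly placed tile yields a global fitness of zero, while the misplaced tile yields a strictly positive error.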


References

  1. Collet, Pierre; Louchet, Jean (Oct 2009). "Artificial evolution and the Parisian approach: applications in the processing of signals and images". In Siarry, Patrick (ed.). Optimization in Signal and Image Processing. Wiley-ISTE. ISBN 9781848210448.
  2. Louchet, Jean (Feb 2000). L'algorithme des mouches : une stratégie d'évolution individuelle appliquée en stéréovision. Reconnaissance des Formes et Intelligence Artificielle (RFIA2000).
  3. Louchet, Jean (Sep 2000). Stereo analysis using individual evolution strategy. Proceedings of the 15th International Conference on Pattern Recognition (ICPR'00). Barcelona, Spain: IEEE. pp. 908–911. doi:10.1109/ICPR.2000.905580. ISBN 0-7695-0750-6.
  4. Louchet, Jean (Jun 2001). "Using an Individual Evolution Strategy for Stereovision". Genetic Programming and Evolvable Machines. 2 (2): 101–109. doi:10.1023/A:1011544128842. S2CID 8953837.
  5. Boumaza, Amine; Louchet, Jean (Apr 2003). "Mobile robot sensor fusion using flies". Lecture Notes in Computer Science. European Conference on Genetic Programming (EuroGP 2003). Vol. 2611. Essex, UK: Springer. pp. 357–367. doi:10.1007/3-540-36605-9_33. ISBN 978-3-540-00976-4.
  6. Louchet, Jean; Guyon, Maud; Lesot, Marie-Jeanne; Boumaza, Amine (Mar 2002). "L'algorithme des mouches dynamiques : guider un robot par évolution artificielle en temps réel" (PDF). In Lattaud, Claude (ed.). Apprentissage Automatique et Evolution Artificielle (in French). Hermes Sciences Publications. ISBN 978-2746203600.
  7. Louchet, Jean; Guyon, Maud; Lesot, Marie-Jeanne; Boumaza, Amine (Jan 2002). "Dynamic Flies: a new pattern recognition tool applied to stereo sequence processing" (PDF). Pattern Recognition Letters. 23 (1–3): 335–345. doi:10.1016/S0167-8655(01)00129-5.
  8. Boumaza, Amine; Louchet, Jean (Apr 2001). "Dynamic Flies: Using Real-time evolution in Robotics". Lecture Notes in Computer Science. Artificial Evolution in Image Analysis and Signal Processing (EVOIASP2001). Vol. 2037. Como, Italy: Springer. pp. 288–297. doi:10.1007/3-540-45365-2_30. ISBN 978-3-540-41920-4.
  9. Louchet, Jean; Sapin, Emmanuel (2009). "Flies Open a Door to SLAM". Lecture Notes in Computer Science. Applications of Evolutionary Computation (EvoApplications 2009). Vol. 5484. Tübingen, Germany: Springer. pp. 385–394. doi:10.1007/978-3-642-01129-0_43.
  10. Bousquet, Aurélie; Louchet, Jean; Rocchisani, Jean-Marie (Oct 2007). "Fully Three-Dimensional Tomographic Evolutionary Reconstruction in Nuclear Medicine" (PDF). Lecture Notes in Computer Science. Proceedings of the 8th International Conference on Artificial Evolution (EA'07). Vol. 4926. Tours, France: Springer, Heidelberg. pp. 231–242. doi:10.1007/978-3-540-79305-2_20. ISBN 978-3-540-79304-5.
  11. Vidal, Franck P.; Lazaro-Ponthus, Delphine; Legoupil, Samuel; Louchet, Jean; Lutton, Évelyne; Rocchisani, Jean-Marie (Oct 2009). "Artificial evolution for 3D PET reconstruction" (PDF). Lecture Notes in Computer Science. Proceedings of the 9th International Conference on Artificial Evolution (EA'09). Vol. 5975. Strasbourg, France: Springer, Heidelberg. pp. 37–48. doi:10.1007/978-3-642-14156-0_4. ISBN 978-3-642-14155-3.
  12. Vidal, Franck P.; Louchet, Jean; Lutton, Évelyne; Rocchisani, Jean-Marie (Oct–Nov 2009). "PET reconstruction using a cooperative coevolution strategy in LOR space". IEEE Nuclear Science Symposium Conference Record (NSS/MIC), 2009. Medical Imaging Conference (MIC). Orlando, Florida: IEEE. pp. 3363–3366. doi:10.1109/NSSMIC.2009.5401758.
  13. Vidal, Franck P.; Louchet, Jean; Rocchisani, Jean-Marie; Lutton, Évelyne (Apr 2010). "New genetic operators in the Fly Algorithm: application to medical PET image reconstruction" (PDF). Lecture Notes in Computer Science. European Workshop on Evolutionary Computation in Image Analysis and Signal Processing (EvoIASP'10). Vol. 6024. Istanbul, Turkey: Springer, Heidelberg. pp. 292–301. doi:10.1007/978-3-642-12239-2_30. ISBN 978-3-642-12238-5.
  14. Vidal, Franck P.; Lutton, Évelyne; Louchet, Jean; Rocchisani, Jean-Marie (Sep 2010). "Threshold selection, mitosis and dual mutation in cooperative coevolution: application to medical 3D tomography" (PDF). Lecture Notes in Computer Science. International Conference on Parallel Problem Solving From Nature (PPSN'10). Vol. 6238. Krakow, Poland: Springer, Heidelberg. pp. 414–423. doi:10.1007/978-3-642-15844-5_42.
  15. Ali Abbood, Zainab; Lavauzelle, Julien; Lutton, Évelyne; Rocchisani, Jean-Marie; Louchet, Jean; Vidal, Franck P. (2017). "Voxelisation in the 3-D Fly Algorithm for PET" (PDF). Swarm and Evolutionary Computation. 36: 91–105. doi:10.1016/j.swevo.2017.04.001. ISSN 2210-6502.
  16. Ali Abbood, Zainab; Amlal, Othman; Vidal, Franck P. (Apr 2017). "Evolutionary Art Using the Fly Algorithm" (PDF). Lecture Notes in Computer Science. Applications of Evolutionary Computation (EvoApplications 2017). Vol. 10199. Amsterdam, the Netherlands: Springer. pp. 455–470. doi:10.1007/978-3-319-55849-3_30.
  17. Mesejo, Pablo; Ibanez, Oscar; Fernandez-Blanco, Enrique; Cedron, Francisco; Pazos, Alejandro; Porto-Pazos, Ana (2015). "Artificial Neuron–Glia Networks Learning Approach Based on Cooperative Coevolution" (PDF). International Journal of Neural Systems. 25 (4): 1550012. doi:10.1142/S0129065715500124. hdl:2183/17502. PMID 25843127.
  18. Kennedy, J.; Eberhart, R. (1995). Particle swarm optimization. Proceedings of IEEE International Conference on Neural Networks. IEEE. pp. 1942–1948. doi:10.1109/ICNN.1995.488968.
  19. Shi, Y.; Eberhart, R. (1998). A modified particle swarm optimizer. Proceedings of IEEE International Conference on Evolutionary Computation. IEEE. pp. 69–73. doi:10.1109/ICEC.1998.699146.
  20. Ali Abbood, Zainab; Vidal, Franck P. (2017). "Basic, Dual, Adaptive, and Directed Mutation Operators in the Fly Algorithm". Lecture Notes in Computer Science. 13th Biennial International Conference on Artificial Evolution (EA-2017). Paris, France. pp. 106–119. ISBN 978-2-9539267-7-4.
  21. Ali Abbood, Zainab; Vidal, Franck P. (Oct 2017). "Fly4Arts: Evolutionary Digital Art with the Fly Algorithm". Art and Science. 17-1 (1): 1–6. doi:10.21494/ISTE.OP.2017.0177.
  22. Vidal, Franck P.; Lutton, Évelyne; Louchet, Jean; Rocchisani, Jean-Marie (Sep 2010). "Threshold selection, mitosis and dual mutation in cooperative co-evolution: Application to medical 3D tomography" (PDF). Lecture Notes in Computer Science. Parallel Problem Solving from Nature – PPSN XI. Vol. 6238. Kraków, Poland: Springer Berlin / Heidelberg. pp. 414–423. doi:10.1007/978-3-642-15844-5_42. ISBN 978-3-642-15843-8.
  23. Ali Abbood, Zainab; Vidal, Franck P. (Oct 2017). "Basic, Dual, Adaptive, and Directed Mutation Operators in the Fly Algorithm". Lecture Notes in Computer Science. 13th Biennial International Conference on Artificial Evolution. Paris, France: Springer-Verlag.