An Unpopular Optimization Algorithm: Stochastic Fractal Search

Mohammed El komy
Published in Analytics Vidhya
Jan 28, 2021

Evolutionary algorithms are nature-inspired approximate optimization techniques used in many scientific problems. They are mostly employed to provide plausible approximate solutions when finding exact solutions would take an unreasonable amount of time with traditional exhaustive brute-force exploration of the search space.

This post dives into one successful metaheuristic algorithm, called Stochastic Fractal Search, originally introduced by Dr. Hamid Salimi in his article.

As the name suggests, Stochastic Fractal Search (SFS) combines the concept of fractals with stochasticity to explore a given search space.

If you have studied computer graphics, you have probably admired those impressive recursive patterns called fractal images.

(Left): Sierpinski triangle, (Right): Mandelbrot Set

As shown above in those fascinating GIFs, on the left is an animation of the Sierpinski triangle, which is created by recursively dividing a triangle into four triangles using three lines. On the right is the very famous Mandelbrot set, which is based on a recursive complex function that reveals progressively ever-finer recursive detail at increasing magnifications.

Don’t be misled by the GIFs: these shapes can never be fully represented in computer memory because their detail is infinite. No matter how much memory you can imagine, it is not enough to hold every level of recursion!
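The Mandelbrot set mentioned above comes from a very short recursion: repeatedly apply z ← z² + c and check whether z stays bounded. A minimal sketch (the iteration cap of 100 is an arbitrary choice for illustration):

```python
def mandelbrot_iterations(c, max_iter=100):
    """Count iterations of z <- z**2 + c before |z| exceeds 2.

    Points that never escape within max_iter are treated as members of
    the Mandelbrot set; the escape count is what drives the colouring
    seen in typical renderings.
    """
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n      # escaped: c is outside the set
    return max_iter       # assumed inside the set

print(mandelbrot_iterations(0j))       # origin never escapes
print(mandelbrot_iterations(1 + 0j))   # escapes after a few steps
```

Zooming in simply means evaluating this same recursion on a finer grid of `c` values, which is why the detail never runs out.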

Formally, stochastic fractal search is a meta-heuristic optimization algorithm that uses the idea of fractals to satisfy the intensification (exploitation) property needed by optimization algorithms and the stochasticity feature to guarantee the diversification (exploration) of the search space.

If you are interested in the development of the algorithm, you may check the original work, since SFS is an improvement of an even simpler algorithm called Fractal Search.

Algorithm Steps:

SFS mainly has two steps:
1. Diffusion process (exploitation):
Based on the concept of fractals mentioned above, the algorithm applies what’s called a diffusion process by producing new points after taking a random walk step.

Fractal Growth Using diffusion-limited aggregation (Fig 1 in the original article)
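The exact Gaussian-walk formulas are in the original article; the sketch below only illustrates the general idea under simplified assumptions (a minimization objective, walks centred on the best point, and a user-supplied `sigma`):

```python
import numpy as np

def diffuse(point, best_point, objective, sigma, n_walks=3, rng=None):
    """Simplified sketch of one SFS-style diffusion step.

    A particle spawns `n_walks` Gaussian-walk candidates around the
    current best point and keeps whichever candidate (including the
    original point) has the lowest objective value. In the paper the
    walk's standard deviation shrinks over the generations, which
    concentrates the search (exploitation).
    """
    rng = rng or np.random.default_rng()
    candidates = [point]
    for _ in range(n_walks):
        candidates.append(rng.normal(loc=best_point, scale=sigma))
    return min(candidates, key=objective)

# Toy usage: minimise the sphere function around the best point so far.
sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
new_point = diffuse(np.array([2.0, -1.0]), np.array([0.5, 0.5]),
                    sphere, sigma=0.3, rng=np.random.default_rng(0))
```

Because the original point is kept among the candidates, a diffusion step can never make a particle worse.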

2. Update process (exploration):
This is the process of generating new solutions by randomly mixing them with other solutions. It is done through two update steps: the first involves element-wise updates, while the second is a weighted sum of multiple vectors in the population.
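The two update steps can be sketched as follows. This is a simplified illustration only: the mixing rate of 0.5 is an assumption, and the paper uses rank-based acceptance probabilities and slightly different vector combinations, which are omitted here:

```python
import numpy as np

def update_population(pop, fitness, rng=None):
    """Simplified sketch of the two SFS update steps.

    Step 1 mixes individual components of each solution with randomly
    chosen peers (element-wise update). Step 2 moves whole vectors via
    a weighted sum involving the best point (assumes minimization).
    """
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    best = pop[np.argmin(fitness)]

    # Step 1: element-wise update using two random peers per coordinate
    new_pop = pop.copy()
    for i in range(n):
        for j in range(d):
            if rng.random() < 0.5:  # assumed mixing rate for illustration
                r, t = rng.choice(n, size=2, replace=False)
                eps = rng.random()
                new_pop[i, j] = pop[r, j] - eps * (pop[t, j] - pop[i, j])

    # Step 2: whole-vector weighted sum pulling toward the best point
    for i in range(n):
        t = rng.integers(n)
        eps = rng.random()
        new_pop[i] = new_pop[i] + eps * (best - new_pop[t])

    return new_pop
```

In the full algorithm these updates are only accepted when they improve a solution, which preserves the best candidates found so far.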

More details on the math can be found in the original article and in my survey report (which includes selected applications ranging from computer vision to PID controller parameter tuning).

Demo:

Optimizing the Styblinski-Tang function using SFS:

[Demo] Red spheres are candidate solutions in the population of the search algorithm
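A toy version of this demo can be reproduced with a few lines. The loop below is an SFS-flavoured sketch, not the full algorithm: it only performs Gaussian diffusion around the best point with a shrinking step size, and the population size, bounds, and schedule are arbitrary illustrative choices:

```python
import numpy as np

def styblinski_tang(x):
    """Global minimum is about -39.166 * d at x_i ~ -2.903534."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.sum(x ** 4 - 16 * x ** 2 + 5 * x)

def optimise(dim=2, pop_size=20, generations=200, seed=0):
    """Toy diffusion-only loop (not the full SFS algorithm)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    for g in range(generations):
        fit = np.array([styblinski_tang(p) for p in pop])
        best = pop[np.argmin(fit)]
        # shrinking walk size: broad exploration first, then refinement
        sigma = 2.0 * (1 - g / generations) + 1e-3
        walks = rng.normal(loc=best, scale=sigma, size=pop.shape)
        # greedy acceptance: keep whichever of (old point, walk) is better
        better = np.array([styblinski_tang(w) for w in walks]) < fit
        pop[better] = walks[better]
    fit = np.array([styblinski_tang(p) for p in pop])
    return pop[np.argmin(fit)], fit.min()

best_x, best_f = optimise()
print(best_x, best_f)
```

In two dimensions the global minimum sits near (-2.9035, -2.9035) with a value of about -78.33; the red spheres in the demo animation are the population members converging toward that basin.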

Code:

The original creator of SFS has published the MATLAB code of the algorithm via MATLAB Central File Exchange.
I also have a Python re-implementation of the algorithm shared on GitHub under the GPL-3.0 License. Feel free to check it out and contribute :)

That’s it for today :)

I used the open-source tool gifski to generate the high-quality GIFs 🎊🎊

Resources:
The original article: https://www.sciencedirect.com/science/article/abs/pii/S0950705114002822
The original MATLAB implementation: https://www.mathworks.com/matlabcentral/fileexchange/47565-stochastic-fractal-search-sfs
My Python implementation: https://github.com/mohammed-elkomy/stochastic-fractal-search-python
My survey report: https://arxiv.org/abs/2102.01503
