Constriction Factors and Parameters

This section details variations of the basic PSO equations that have been studied with the purpose of improving aspects of its performance. The review is based on [1] and [2].

Constriction factors

In [3], velocity clamping was introduced to avoid the phenomenon of "swarm explosion". A simple one-dimensional analysis of the swarm dynamics shows that, with no restriction on the maximum velocity, a particle's velocity can grow unbounded while the particle oscillates around an optimum, increasing its distance to the optimum on each iteration. In several studies, the velocity clamping parameter $v_{max}$ was set to the maximum value allowed for the positions in the search space (called $x_{max}$). Further studies demonstrated that this mechanism was not enough to properly control the particles' velocities: when compared to EC techniques [4], PSO quickly identified the region where the optimum was located, but had trouble adjusting the velocity to lower values to perform a fine-grained search of that area. Later versions of the PSO algorithm [5] use the following modified velocity equation, with which velocity clamping is no longer needed:
\begin{displaymath}
v^{t+1}_{id} = \chi \left( w \cdot v^{t}_{id} + c_1 \cdot \psi_1 \cdot (p^{t}_{id} - x^{t}_{id}) + c_2 \cdot \psi_2 \cdot (p^{t}_{gd} - x^{t}_{id}) \right)
\end{displaymath} (1)

The term $w$ is called the "inertia weight", and the term $\chi$ is the "constriction factor". The term $c_1$ is the "cognitive parameter" and the term $c_2$ is the "social parameter". The terms $\psi_1$ and $\psi_2$ are random numbers drawn uniformly from $[0,1]$, and $p^{t}_{id}$ and $p^{t}_{gd}$ denote the best positions found so far by the particle itself and by its neighborhood, respectively. The combination of these parameters determines the convergence properties of the algorithm.
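As an illustration only (not the reference implementation of [5]), the constricted velocity update of equation (1) can be sketched in Python for a single particle; the swarm bookkeeping (neighborhoods, fitness evaluation) is assumed to happen elsewhere:

```python
import random

def update_velocity(v, x, p_best, g_best, chi=0.7298, w=1.0, c1=2.05, c2=2.05):
    """Constricted PSO velocity update (Eq. 1) for one particle.

    v, x, p_best, g_best are lists of floats (one entry per dimension);
    psi_1 and psi_2 are drawn uniformly from [0, 1] for each dimension.
    The default parameter values are a common choice, not mandated by Eq. (1).
    """
    new_v = []
    for d in range(len(v)):
        psi1, psi2 = random.random(), random.random()
        cognitive = c1 * psi1 * (p_best[d] - x[d])  # pull toward the particle's own best
        social = c2 * psi2 * (g_best[d] - x[d])     # pull toward the neighborhood best
        new_v.append(chi * (w * v[d] + cognitive + social))
    return new_v
```

Note that when the particle sits exactly on both best positions with zero velocity, the update returns zero velocity, which is the equilibrium the convergence analyses in [5,8] study.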

Parameter Value Selection

For the initial version of PSO, values for $c_1$, $c_2$, and $v_{max}$ have to be selected. This selection affects both the convergence speed and the ability of the algorithm to find the optimum, and different values may suit different problems. Much work has been done to find a combination of values that works well across a wide range of problems. For the constricted version of PSO, the following restrictions are proposed in [5]:

\begin{displaymath}
\chi = \frac{2}{\left\vert 2 - \varphi - \sqrt{\varphi^2 - 4 \cdot \varphi} \right\vert}
\end{displaymath} (2)
\begin{displaymath}
\varphi = c_1 + c_2 > 4.0
\end{displaymath} (3)

The inertia weight determines how much the previous velocity of the particle influences its velocity in the next iteration.

In [6,7] a decaying inertia weight is proposed and tested, with the aim of favoring global search at the start of the run and local search later. If the inertia weight is not reduced over time, they suggest selecting a value $w \in \left[0.8,1.2\right]$. The values of the cognitive and social factors are not critical for the algorithm, but proper values may result in better performance, both in terms of convergence speed and avoidance of local minima; their values also have to be taken into account when choosing the inertia weight. Several studies propose values for these parameters that are considered adequate for the usual benchmark functions:

$c_1$    $c_2$    $w_{init}$  $w_{final}$  Reference
2.0      2.0      1.0         1.0          [3]
2.0      2.0      0.9         0.4          [7]
1.4962   1.4962   0.7968      0.7968       [8]
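The decaying schedule of [6,7] is usually realized as a linear interpolation between $w_{init}$ and $w_{final}$ over the allotted iterations; a minimal sketch, using the 0.9 to 0.4 range from the table above (the linear form is an assumption of this sketch, not a detail stated in the text):

```python
def inertia_weight(t, max_t, w_init=0.9, w_final=0.4):
    """Linearly decaying inertia weight: w_init at t = 0, w_final at t = max_t."""
    return w_init - (w_init - w_final) * t / max_t
```

Early iterations thus keep a large $w$ (wide, global exploration), while late iterations shrink it toward $w_{final}$ (local refinement).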

In [5,8], theoretical studies of convergence were performed; in the latter, a heuristic procedure for parameter selection and tuning was proposed, based on the convergence speed of the different parameter combinations. In [9], the best results were achieved when the constriction and velocity clamping mechanisms were combined, using $v_{max} = x_{max}$.
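The combination reported in [9], constriction plus clamping with $v_{max} = x_{max}$, amounts to capping each component of the constricted velocity; a sketch (the helper name is my own, not from [9]):

```python
def clamp_velocity(v, x_max):
    """Clamp each velocity component to [-v_max, v_max], with v_max = x_max as in [9]."""
    return [max(-x_max, min(x_max, vd)) for vd in v]
```

Applied after the constricted update, this keeps any occasional large step within the bounds of the search space.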

Bibliography

1
K. E. Parsopoulos and M. N. Vrahatis.
Recent approaches to global optimization problems through particle swarm optimization.
Natural Computing: An International Journal, 1(2-3):235-306, 2002.

2
F. van den Bergh.
An analysis of particle swarm optimizers.
PhD thesis, University of Pretoria, South Africa, 2002.

3
J. Kennedy, R.C. Eberhart, and Y. Shi.
Swarm intelligence.
Morgan Kaufmann Publishers, San Francisco, 2001.

4
P.J. Angeline.
Evolutionary optimization versus particle swarm optimization: Philosophy and performance differences.
In Proceedings of the VII Conference on Evolutionary Programming, pages 601-610. Springer-Verlag GmbH, 1998.

5
M. Clerc and J. Kennedy.
The particle swarm - explosion, stability, and convergence in a multidimensional complex space.
IEEE Trans. Evolutionary Computation, 6(1):58-73, 2002.

6
Y. Shi and R.C. Eberhart.
Parameter selection in particle swarm optimization.
In Proceedings of the Seventh Annual Conference on Evolutionary Programming, pages 591-600, 1998.

7
Y. Shi and R.C. Eberhart.
Empirical study of particle swarm optimization.
In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), pages 1945-1950, 1999.

8
I.C. Trelea.
The particle swarm optimization algorithm: convergence analysis and parameter selection.
Inf. Process. Lett., 85(6):317-325, 2003.

9
R.C. Eberhart and Y. Shi.
Comparing inertia weights and constriction factors in particle swarm optimization.
In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), pages 84-89, 2000.
Copyright ©2005, University CARLOS III of Madrid