Comparison of Evolution Strategy and Back-Propagation
for Estimating Parameters of Neural Networks
Roman Malczyk, Ales Gottvald
Institute of Scientific Instruments, Academy of Sciences of the CR,
Kralovopolska 147, CZ-612 64 Brno, Czech Republic
E-mail: malczyk@isibrno.cz, gott@isibrno.cz
Abstract:
Evolution Strategy (ES) and Back-Propagation (BP) have been compared for
estimating parameters (synaptic weights and biases) of two classes of Neural Networks:
Sigmoidal Neural Networks (SNN) and Green's Regularization Networks (GRN). The following
features have been compared: globality of convergence, speed of convergence, asymptotic
behaviour, and stability of the solutions. On average, numerical experiments show better
globality of convergence and better asymptotic behaviour when using the (1+1)-Evolution
Strategy instead of Back-Propagation. While the speed of convergence is comparable for
ES and BP, the simplicity of implementation makes the ES superior to BP. These
conclusions hold for both the Sigmoidal and the Green's Regularization Networks.
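To illustrate the kind of procedure a (1+1)-Evolution Strategy implies for estimating
synaptic weights and biases, the following is a minimal Python sketch, assuming Gaussian
mutation with Rechenberg's 1/5 success rule and a small sigmoidal network on a toy XOR
task; the network size, data, step-size schedule, and all names are illustrative
assumptions, not the experimental setup reported above.

# Minimal (1+1)-ES sketch for estimating weights and biases of a small sigmoidal
# network; all choices (XOR data, 3 hidden units, Gaussian mutation, 1/5 rule) are
# illustrative assumptions, not the paper's actual experiments.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR problem (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 3
# Flat parameter vector: hidden weights (2x3), hidden biases (3),
# output weights (3), output bias (1).
N_PARAMS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def forward(params, X):
    """Evaluate the sigmoidal network for a flat parameter vector."""
    W1 = params[:2 * N_HIDDEN].reshape(2, N_HIDDEN)
    b1 = params[2 * N_HIDDEN:3 * N_HIDDEN]
    w2 = params[3 * N_HIDDEN:4 * N_HIDDEN]
    b2 = params[-1]
    h = sigmoid(X @ W1 + b1)
    return sigmoid(h @ w2 + b2)


def mse(params):
    return np.mean((forward(params, X) - y) ** 2)


def one_plus_one_es(n_iter=20000, sigma=0.5):
    """(1+1)-ES: one parent, one Gaussian-mutated offspring per generation;
    the offspring replaces the parent only if its error is not worse.
    The step size sigma is adapted with Rechenberg's 1/5 success rule."""
    parent = rng.normal(scale=0.5, size=N_PARAMS)
    parent_err = mse(parent)
    successes = 0
    for t in range(1, n_iter + 1):
        child = parent + sigma * rng.normal(size=N_PARAMS)
        child_err = mse(child)
        if child_err <= parent_err:
            parent, parent_err = child, child_err
            successes += 1
        if t % 50 == 0:                      # adapt sigma every 50 generations
            rate = successes / 50.0
            sigma *= 1.22 if rate > 0.2 else 0.82
            successes = 0
    return parent, parent_err


if __name__ == "__main__":
    best, err = one_plus_one_es()
    print("final MSE:", err)
    print("network outputs:", np.round(forward(best, X), 3))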