12–16 Apr 2010
Uppsala University

Grid-enabled parameter initialization for high performance machine learning tasks

14 Apr 2010, 12:00
20m
Room IX (Uppsala University)

Oral | Scientific results obtained using distributed computing technologies | Computer Science

Speakers

Mr Fotis Psomopoulos (Aristotle University of Thessaloniki), Mr Kyriakos Chatzidimitriou (Aristotle University of Thessaloniki)

Description

In this work we use the NeuroEvolution of Augmenting Topologies (NEAT) methodology to optimise Echo State Networks (ESNs) and achieve high performance in machine learning tasks. The large parameter space of NEAT, the many variations of ESNs, and the stochastic nature of evolutionary computation, which requires many evaluations to draw statistically valid conclusions, make the Grid a viable solution for robustly evaluating the alternatives and deriving significant conclusions.
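For context, the following is a minimal sketch of the standard leaky-integrator ESN state update that such parameters govern; the dimensions and hyperparameter values are illustrative assumptions, not the settings used in this work:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative dimensions and hyperparameters (assumed, not the authors' values).
n_inputs, n_reservoir = 1, 100
spectral_radius = 0.9   # typical knob controlling the echo state property
leak_rate = 0.3         # leaky-integrator blending factor
input_scaling = 1.0

# Random input and recurrent weights; rescale W to the target spectral radius.
W_in = rng.uniform(-1.0, 1.0, (n_reservoir, n_inputs)) * input_scaling
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def step(x, u):
    """One reservoir update: x' = (1 - a) * x + a * tanh(W_in u + W x)."""
    return (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)

# Example: drive the reservoir with a scalar input sequence.
x = np.zeros(n_reservoir)
for u in np.sin(np.linspace(0, 10, 200)):
    x = step(x, np.array([u]))
```

Each of the constants above (spectral radius, leak rate, input scaling) is one of the continuous parameters whose joint tuning motivates the Grid-based search described below.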

Conclusions and Future Work

Our goal is to demonstrate the use of Grid resources for a) the optimisation of our algorithm for the topology and weight evolution of ESNs, in terms of parameter fine-tuning, and b) the macroscopic observation of the algorithm's behaviour under different setups, in order to eventually improve its underlying mechanisms. Our ongoing efforts include improving the algorithm in terms of speed, accuracy, generalisation and robustness, and augmenting our pool of test cases for better performance evaluation.

Impact

NEAT and ESNs represent two of the most influential algorithms in the areas of neuroevolution and reservoir computing, respectively. To the best of our knowledge, there is no reported research on parameter initialisation for NEAT, while for ESNs, reservoir optimisation is an active area of research. Their explicit combination has not been tried before, and to avoid local-optimum behaviour, parameter fine-tuning is routed through Grid resources. Additionally, we believe that this study will provide better intuition on how to improve such algorithms by studying macroscopic features, since microscopic observation is extremely difficult when evolutionary computation and neural networks are involved; the Grid enables such a strategy. The optimised results can be compared to recent efforts to optimise reservoirs and networks in the same domains. The overall goal is to make the algorithm more computationally efficient and thus applicable to real-life tasks.

Detailed analysis

Both NEAT and ESNs involve a large parameter space: around 23 continuous parameters for the former and about a dozen for the latter. Even though our joint approach slightly reduces these numbers, the order of complexity remains the same. Our aim is to discover parameter regions that drive the algorithm to optimal performance while preserving its sensitivity to parameter variation. We apply Grid-enabled brute-force search over parameter setups, varying 2-4 parameters at a time with values chosen from small expert-created sets. Performance is measured on three time-series benchmarks widely used in the ESN community: a) the Mackey-Glass system, b) the Multiple Superimposed Oscillator and c) the Lorenz attractor, as well as on reinforcement learning testbeds, including a) single- and double-pole balancing and b) the 2D and 3D mountain car tasks. An additional search is performed in the vicinity of the optimal values of the most influential parameters. Statistical significance is also measured.
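For illustration, here is a minimal sketch of how such a brute-force sweep could be enumerated for submission as independent Grid jobs; the parameter names and candidate values are hypothetical examples, not the actual expert-created sets:

```python
import itertools

# Hypothetical expert-chosen candidate values for a few parameters
# varied at a time (names and values are illustrative only).
param_grid = {
    "spectral_radius": [0.8, 0.9, 0.99],
    "leak_rate": [0.1, 0.3, 0.5],
    "mutation_rate": [0.05, 0.1],
}

# Enumerate every combination; each setup would become one Grid job,
# repeated over many random seeds for statistically valid comparisons.
names = list(param_grid)
setups = [dict(zip(names, values))
          for values in itertools.product(*param_grid.values())]

for i, setup in enumerate(setups):
    print(f"job {i:03d}: {setup}")
```

The embarrassingly parallel structure of the sweep is what makes it a natural fit for Grid resources: each setup-seed pair runs independently and only the final performance scores need to be aggregated.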

URL for further information http://issel.ee.auth.gr/
Keywords Neuroevolution, Echo State Networks, Parameter optimisation

Primary authors

Mr Fotis Psomopoulos (Aristotle University of Thessaloniki), Mr Kyriakos Chatzidimitriou (Aristotle University of Thessaloniki)

Co-author

Prof. Pericles Mitkas (Aristotle University of Thessaloniki)
