WestminsterResearch

Genetic evolution of controllers for challenging control problems

Dracopoulos, Dimitris (2011) Genetic evolution of controllers for challenging control problems. Journal of Computational Methods in Science and Engineering, 11 (4). pp. 227-242. ISSN 1472-7978


Official URL: http://dx.doi.org/10.3233/JCM-2011-0388

Abstract

The automatic construction of controllers would be ideal in situations where traditional control theory and algorithms fail, as is the case with certain dynamical systems. Genetic programming, a field under the "umbrella" of evolutionary computation, is capable of creating computer programs given a high-level description of a problem. The evolution of such computer programs is driven by their fitness. The fitness is defined by an objective function, which measures how well a particular program performs on the specific problem it tries to solve. Any controller can be described in terms of a computer program and thus, at least in theory, genetic programming offers an ideal candidate for the automatic construction of controllers. This paper considers the application of genetic programming to two different problems: the aircraft autolanding problem and the bioreactor control problem, both of which have been suggested in the literature as challenging benchmarks in the quest for building automatic controllers. The results presented here show that successful control laws in analytic form are derived for both cases.
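To illustrate the general approach described in the abstract (and not the article's actual experimental setup), the following is a minimal genetic-programming sketch: candidate control laws are expression trees over the tracking error and its rate, and each candidate's fitness is the negative accumulated tracking error obtained by simulating it on a toy double-integrator plant. The plant model, operator set, and GP parameters below are illustrative assumptions only.

```python
# Minimal GP sketch (illustrative only): evolve a control law u = f(err, derr)
# as an expression tree, scored by simulating a toy double-integrator plant.
import random
import operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['err', 'derr', 'const']

def random_tree(depth=3):
    """Grow a random expression tree: tuples for operators, strings/floats for leaves."""
    if depth == 0 or random.random() < 0.3:
        t = random.choice(TERMINALS)
        return random.uniform(-2, 2) if t == 'const' else t
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, err, derr):
    """Evaluate a tree for the current error and error rate."""
    if isinstance(tree, float):
        return tree
    if tree == 'err':
        return err
    if tree == 'derr':
        return derr
    op, a, b = tree
    return OPS[op](evaluate(a, err, derr), evaluate(b, err, derr))

def fitness(tree, setpoint=1.0, dt=0.05, steps=200):
    """Objective function: negative accumulated tracking error on a toy plant."""
    x, v, cost = 0.0, 0.0, 0.0
    prev_err = setpoint - x
    for _ in range(steps):
        err = setpoint - x
        derr = (err - prev_err) / dt
        u = max(-10.0, min(10.0, evaluate(tree, err, derr)))  # saturate actuator
        v += u * dt          # double-integrator dynamics (assumed plant)
        x += v * dt
        cost += abs(err) * dt
        prev_err = err
        if abs(x) > 1e3:     # penalise divergent controllers heavily
            return -1e9
    return -cost

def mutate(tree):
    """Replace a subtree with a freshly grown one (simplified mutation)."""
    if random.random() < 0.2 or not isinstance(tree, tuple):
        return random_tree(2)
    return (tree[0], mutate(tree[1]), tree[2])

def crossover(a, b):
    """Combine material from two parents (very simplified subtree exchange)."""
    if isinstance(a, tuple) and random.random() < 0.5:
        return (a[0], crossover(a[1], b), a[2])
    return b if random.random() < 0.5 else a

# Evolution loop: rank by fitness, keep the best, refill with offspring.
pop = [random_tree() for _ in range(100)]
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    best = pop[0]
    survivors = pop[:20]
    pop = survivors + [
        mutate(crossover(random.choice(survivors), random.choice(survivors)))
        for _ in range(80)
    ]

print("best evolved control law:", best, "fitness:", fitness(best))
```

Because the evolved individual is an expression tree, the resulting controller can be read off directly in analytic form, which is the property the paper exploits for the autolanding and bioreactor benchmarks.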

Item Type: Article
Research Community: University of Westminster > Electronics and Computer Science, School of
ID Code: 10977
Deposited On: 30 Jul 2012 16:05
Last Modified: 14 May 2013 13:01
