Local Function Approximation in Evolutionary Algorithms for the Optimization of Costly Functions
- 25 October 2004
- research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Evolutionary Computation
- Vol. 8 (5), 490-505
- https://doi.org/10.1109/tevc.2004.835247
Abstract
We develop an approach for the optimization of continuous costly functions that uses a space-filling experimental design and local function approximation to reduce the number of function evaluations in an evolutionary algorithm. Our approach is to estimate the objective function value of an offspring by fitting a function approximation model over the k nearest previously evaluated points, where k=(d+1)(d+2)/2 and d is the dimension of the problem. The estimated function values are used to screen offspring to identify the most promising ones for function evaluation. To fit function approximation models, a symmetric Latin hypercube design (SLHD) is used to determine initial points for function evaluation. We compared the performance of an evolution strategy (ES) with local quadratic approximation, an ES with local cubic radial basis function (RBF) interpolation, an ES whose initial parent population comes from an SLHD, and a conventional ES. These algorithms were applied to a twelve-dimensional (12-D) groundwater bioremediation problem involving a complex nonlinear finite-element simulation model. The performances of these algorithms were also compared on the Dixon-Szego test functions and on the ten-dimensional (10-D) Rastrigin and Ackley test functions. All comparisons involve analysis of variance (ANOVA) and the computation of simultaneous confidence intervals. The results indicate that ES algorithms with local approximation were significantly better than conventional ES algorithms and ES algorithms initialized by SLHDs on all Dixon-Szego test functions except for Goldstein-Price. However, for the more difficult 10-D and 12-D functions, only the cubic RBF approach was successful in improving the performance of an ES. 
Moreover, the results also suggest that the cubic RBF approach is superior to the quadratic approximation approach on all test functions, and the difference in performance is statistically significant for all test functions with dimension d ≥ 4.
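The abstract describes estimating an offspring's objective value by interpolating over the k = (d+1)(d+2)/2 nearest previously evaluated points with a cubic radial basis function. The paper's exact formulation is not reproduced on this page; below is a minimal sketch of the standard cubic RBF interpolant with a linear polynomial tail (the function name `cubic_rbf_fit` and the sample points are illustrative, not from the paper):

```python
import numpy as np

def cubic_rbf_fit(X, f):
    """Fit s(x) = sum_i lam_i * ||x - x_i||^3 + c0 + c.x through (X, f).

    X: (k, d) array of the k nearest evaluated points; f: (k,) objective values.
    Solves the standard augmented RBF system [Phi P; P^T 0][lam; c] = [f; 0].
    """
    k, d = X.shape
    Phi = np.linalg.norm(X[:, None] - X[None, :], axis=2) ** 3  # cubic kernel
    P = np.hstack([np.ones((k, 1)), X])                         # linear tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([f, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, c = coef[:k], coef[k:]

    def predict(x):
        r = np.linalg.norm(X - x, axis=1)
        return lam @ r**3 + c[0] + c[1:] @ x
    return predict

# Illustration in d = 2, where k = (d+1)(d+2)/2 = 6 points are used.
X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [2., 0.], [0., 2.]])
f = 2.0 + X[:, 0] - 3.0 * X[:, 1]          # a linear test objective
predict = cubic_rbf_fit(X, f)
```

In the algorithm described by the abstract, `predict` would be called on each offspring to rank candidates, and only the most promising ones would be passed to the expensive simulation. Because the interpolant includes a linear tail, it reproduces linear functions exactly, which the assertions below exploit.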
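The initial evaluation points come from a symmetric Latin hypercube design (SLHD): a Latin hypercube in which every design point has a mirror image about the center. The paper's construction is not given on this page; the sketch below shows one common way to build such a design (assuming an even number of points n; `symmetric_lhd` is an illustrative name):

```python
import numpy as np

def symmetric_lhd(n, d, rng=None):
    """Return an (n, d) symmetric Latin hypercube with levels 1..n.

    Each column is a permutation of 1..n (Latin property), and rows are
    paired so that X[i] + X[n-1-i] == n + 1 (symmetry about the center).
    Assumes n is even.
    """
    rng = np.random.default_rng(rng)
    half = n // 2
    X = np.empty((n, d), dtype=int)
    for j in range(d):
        # Pair the levels as (l, n+1-l); give each top-half row one member
        # of a distinct pair, chosen at random, and its mirror row the other.
        perm = rng.permutation(half) + 1           # one index per level pair
        flip = rng.random(half) < 0.5              # which member of the pair
        top = np.where(flip, n + 1 - perm, perm)
        X[:half, j] = top
        X[half:, j] = (n + 1 - top)[::-1]          # mirrored, reversed rows
    return X
```

Dividing the levels by n (after subtracting 0.5) maps the design onto the unit cube; the resulting points would then be evaluated with the true objective before the evolution strategy starts, so that local approximation models can be fit from the outset.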