Book chapter

Towards Better Integration of Surrogate Models and Optimizers

Tinkle Chugh, Alma Rahat, Vanessa Volz, Martin Zaefferer

High-Performance Simulation-Based Optimization, Chapter 7, Pages: 137 - 163

Swansea University Author: Alma Rahat

Published in: High-Performance Simulation-Based Optimization
ISBN: 9783030187637, 9783030187644
ISSN: 1860-949X, 1860-9503
Published: Cham: Springer International Publishing, 2020

URI: https://cronfa.swan.ac.uk/Record/cronfa52249
Abstract: Surrogate-Assisted Evolutionary Algorithms (SAEAs) have proven to be very effective in solving (synthetic and real-world) computationally expensive optimization problems with a limited number of function evaluations. The two main components of SAEAs are the surrogate model and the evolutionary optimizer, both of which use parameters to control their respective behavior. These parameters are likely to interact closely, and hence the exploitation of any such relationships may lead to the design of an enhanced SAEA. In this chapter, as a first step, we focus on Kriging and the Efficient Global Optimization (EGO) framework. We discuss potentially profitable ways of better integrating the model and the optimizer. Furthermore, we investigate in depth how different parameters of the model and the optimizer impact optimization results. In particular, we determine whether there are any interactions between these parameters, and how the problem characteristics impact optimization results. In the experimental study, we use the popular Black-Box Optimization Benchmarking (BBOB) testbed. Interestingly, the analysis finds no evidence for significant interactions between model and optimizer parameters, but independently their performance has a significant interaction with the objective function. Based on our results, we make recommendations on how best to configure EGO.
College: Faculty of Science and Engineering
Start Page: 137
End Page: 163
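
The abstract above centres on Kriging surrogates within the Efficient Global Optimization (EGO) framework. As a rough orientation, the following Python sketch shows the basic EGO loop: fit a Gaussian-process (Kriging) surrogate to the points evaluated so far, then choose the next expensive evaluation by maximising expected improvement. This is a minimal illustration under stated assumptions, using scikit-learn's GaussianProcessRegressor as a stand-in Kriging model and a toy one-dimensional objective; it does not reflect the chapter's experimental configuration or parameter settings.

```python
# Minimal EGO-style loop (illustrative only): Kriging surrogate + expected improvement.
# Assumes scikit-learn's GaussianProcessRegressor as the Kriging model and a toy objective.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy "expensive" 1-D objective, purely for illustration.
    return np.sin(3.0 * x) + 0.1 * x ** 2

def expected_improvement(mu, sigma, f_best):
    # Expected improvement for minimisation; guard against zero predictive variance.
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(5, 1))   # small initial design
y = objective(X).ravel()

for _ in range(10):                        # limited evaluation budget
    # Fit the Gaussian-process (Kriging) surrogate to all evaluated points.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)

    # Maximise expected improvement over a simple candidate grid.
    candidates = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)
    mu, sigma = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]

    # Evaluate the expensive objective at the proposed point and augment the data.
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next)[0])

print("best found:", X[np.argmin(y)].item(), y.min())
```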