The sixth lecture takes place on the 15th of June 2022 at 4:30 PM (CEST), virtually (we will give an update if in-person attendance becomes possible).

The zoom link to attend the lecture is: https://universiteitleiden.zoom.us/j/68788156198?pwd=ZWtrS1NMbmtvL1RVaHkxcmZPckgvZz09

The lecture features Postdoc Furong Ye and PhD candidate Diederick Vermetten; more details on their talks below:

## Talk 1: The impact of cost metrics on algorithm configuration

Finding the best configuration of an algorithm's hyperparameters for a given optimization problem is an important task in evolutionary computation. When performing algorithm configuration, we need to choose an objective, i.e., a cost metric. Depending on the cost metric, configurators can show different performance and return different configurations. It is therefore interesting to investigate the impact of cost metrics on configurators. In this talk, I will present our results for four different approaches, applied to a family of genetic algorithms on 25 diverse pseudo-Boolean optimization problems. The results suggest that even when we are interested in expected running time (ERT) performance, it may be preferable to use an anytime performance measure (AUC) for the configuration task. We also observe that tuning for expected running time is much more sensitive to the budget allocated to the target algorithms.
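As a rough illustration of the two cost metrics mentioned in the abstract, the sketch below computes a simplified ERT and a discrete anytime AUC (fraction of target/budget pairs hit, an area under the ECDF curve) from hypothetical best-so-far run data. The function names and example values are ours, not from the talk, and the definitions are deliberately simplified:

```python
# Hedged sketch: simplified ERT and anytime AUC on hypothetical run data.
# Each run is a list of best-so-far fitness values per evaluation (maximization).

def expected_running_time(runs, target):
    """ERT for reaching `target`: total evaluations spent across all runs,
    divided by the number of successful runs (inf if no run succeeds)."""
    total_evals, successes = 0, 0
    for run in runs:
        hit = next((i + 1 for i, f in enumerate(run) if f >= target), None)
        if hit is not None:
            total_evals += hit
            successes += 1
        else:
            total_evals += len(run)  # failed runs spend their full budget
    return total_evals / successes if successes else float("inf")

def anytime_auc(runs, targets, budget):
    """Fraction of (run, target, budget-step) triples where the target is
    reached within the budget -- a discrete area under the ECDF curve."""
    hits = 0
    for run in runs:
        for t in targets:
            for b in range(1, budget + 1):
                if b <= len(run) and run[b - 1] >= t:
                    hits += 1
    return hits / (len(runs) * len(targets) * budget)

# Two hypothetical runs of best-so-far values:
runs = [[1, 2, 3, 5, 5], [1, 1, 2, 2, 4]]
print(expected_running_time(runs, target=4))   # -> 4.5
print(anytime_auc(runs, targets=[2, 4], budget=5))  # -> 0.5
```

Note how the two metrics reward different behavior: ERT only looks at when a single target is reached, while the anytime AUC aggregates progress over all targets and budgets, which is one reason configurators tuned on each can return different configurations.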

**Furong Ye** is a Postdoc at the Leiden Institute of Advanced Computer Science (LIACS), where he also completed his PhD. His PhD topic is “Benchmarking discrete optimization heuristics: From building a sound experimental environment to algorithm configuration”. He is part of the core development team of IOHprofiler, with a focus on IOHexperimenter. His research interests are the empirical analysis of algorithm performance and (dynamic) algorithm configuration.

## Talk 2: Benchmarking as a stepping stone to dynamic algorithm selection

When comparing optimization heuristics, we typically benchmark them on a pre-defined set of problems and check which one performs best. However, the benchmarking procedure gives us much more information than just which algorithm seems to work best in a particular context. The performance profile of an algorithm tells us something about its underlying behavior, and this knowledge can potentially be exploited. By combining the performance trajectories of two different algorithms, we can obtain a theoretical dynamic algorithm that performs a single switch during the optimization and outperforms both component algorithms. In this talk, we will discuss how detailed benchmark data was used to show the potential of dynamic algorithm selection, and which challenges remain.
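The single-switch idea can be sketched roughly as follows. We assume here that each algorithm is summarized by hypothetical hitting times (in evaluations) for an ordered sequence of targets, and that after the switch the second algorithm simply continues from the target level the first has reached; this is a strong simplification of the actual trajectory-combination analysis, and all names and numbers below are illustrative:

```python
# Hedged sketch: estimate the best single-switch combination of two
# algorithms from their (hypothetical) per-target hitting times.

def best_single_switch(times_a, times_b):
    """For each target k, estimate the runtime of running A until target k
    and then B from target k to the final target; return (runtime, k) of
    the best switch point. Assumes B resumes from A's reached target."""
    best = None
    for k in range(len(times_a)):
        combined = times_a[k] + (times_b[-1] - times_b[k])
        if best is None or combined < best[0]:
            best = (combined, k)
    return best

# A is fast early, B is fast late (hypothetical numbers, targets 0..3):
times_a = [10, 25, 60, 150]   # evaluations A needs to reach each target
times_b = [30, 50, 70, 90]    # evaluations B needs to reach each target
print(best_single_switch(times_a, times_b))  # -> (65, 1)
```

In this toy setting, switching at target 1 yields an estimated 65 evaluations, beating both pure A (150) and pure B (90), which is the kind of potential gain that detailed benchmark data can reveal.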

**Diederick Vermetten** is a PhD candidate at the Leiden Institute of Advanced Computer Science (LIACS). His research interests include benchmarking of optimization heuristics, dynamic algorithm selection and configuration, as well as hyperparameter optimization. He is part of the core development team of IOHprofiler, with a focus on IOHanalyzer.