
Doerr, B., Doerr, C., Kötzing, T.: Provably Optimal Self-Adjusting Step Sizes for Multi-Valued Decision Variables. Parallel Problem Solving From Nature (PPSN). pp. 782-791 (2016).
We regard the problem of maximizing a OneMax-like function defined over an alphabet of size $r$. In previous work [GECCO 2016] we have investigated how three different mutation operators influence the performance of Randomized Local Search (RLS) and the (1+1) Evolutionary Algorithm. This work revealed that among these natural mutation operators, none is superior to the other two for any choice of $r$. We have also given in [GECCO 2016] some indication that the best achievable run time for large $r$ is $\Theta(n \log r(\log n + \log r))$, regardless of how the mutation operator is chosen, as long as it is a static choice (i.e., the distribution used for variation of the current individual does not change over time). In this work we show that we can achieve a better performance if we allow for adaptive mutation operators. More precisely, we analyze the performance of RLS using a self-adjusting mutation strength. In this algorithm, the size of the steps taken in each iteration depends on the success of previous iterations: the mutation strength is increased after a successful iteration and decreased otherwise. We show that this idea yields an expected optimization time of $\Theta(n(\log n + \log r))$, which is optimal among all comparison-based search heuristics. This is the first time that self-adjusting parameter choices are shown to outperform static choices on a discrete multi-valued optimization problem.
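The success-based step-size rule described in this abstract can be sketched as follows. This is a minimal illustration only, not the exact algorithm analyzed in the paper: the distance-based fitness, the doubling/halving update factors, and the names `self_adjusting_rls` and `distance` are all assumptions of this sketch.

```python
import random

def distance(x, target):
    # r-valued OneMax-like fitness: total distance to a hidden target (0 is optimal).
    return sum(abs(a - b) for a, b in zip(x, target))

def self_adjusting_rls(n=20, r=100, max_iters=200000, seed=0):
    rng = random.Random(seed)
    target = [rng.randrange(r) for _ in range(n)]
    x = [rng.randrange(r) for _ in range(n)]
    dx = distance(x, target)
    v = 1  # mutation strength (step size)
    for _ in range(max_iters):
        if dx == 0:
            return True  # optimum found
        y = list(x)
        i = rng.randrange(n)
        # Move one coordinate by +/- v, clamped to the alphabet {0, ..., r-1}.
        y[i] = min(r - 1, max(0, y[i] + rng.choice([-v, v])))
        dy = distance(y, target)
        if dy < dx:
            x, dx = y, dy
            v = min(r - 1, 2 * v)  # success: increase the step size
        else:
            v = max(1, v // 2)     # failure: decrease the step size
    return dx == 0
```

With a multiplicative update like this, the step size can track the remaining distance to the optimum, which is the intuition behind a bound of order $n(\log n + \log r)$ rather than $n \log r(\log n + \log r)$.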

Gao, W., Friedrich, T., Neumann, F.: Fixed-Parameter Single Objective Search Heuristics for Minimum Vertex Cover. Parallel Problem Solving From Nature (PPSN). pp. 740-750 (2016).
We consider how well-known branching approaches for the classical minimum vertex cover problem can be turned into randomized initialization strategies with provable performance guarantees, and we evaluate these strategies experimentally. Furthermore, we show how these techniques can be built into local search components and analyze a basic local search variant that is similar to a state-of-the-art approach called NuMVC. Our experimental results for the two local search approaches show that making use of more complex branching strategies in the local search component can lead to better results on various benchmark graphs.

Friedrich, T., Kötzing, T., Krejca, M.S., Sutton, A.M.: Graceful Scaling on Uniform versus Steep-Tailed Noise. Parallel Problem Solving From Nature (PPSN). pp. 761-770 (2016).
Recently, different evolutionary algorithms (EAs) have been analyzed in noisy environments. The most frequently used noise model for this was additive posterior noise (noise added after the fitness evaluation) drawn from a Gaussian distribution. In particular, for this setting it was shown that the $(\mu+1)$ EA on OneMax does not scale gracefully (higher noise cannot efficiently be compensated for by a higher $\mu$). In this paper we want to understand whether there is anything special about the Gaussian distribution that makes the $(\mu+1)$ EA not scale gracefully. We keep the setting of posterior noise, but we look at other distributions. We see that for exponential tails the $(\mu+1)$ EA on OneMax also does not scale gracefully, for similar reasons as in the case of Gaussian noise. On the other hand, for uniform distributions (as well as other, similar distributions) we see that the $(\mu+1)$ EA on OneMax does scale gracefully, indicating the importance of the noise model.
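The posterior noise models compared in this abstract differ only in the distribution added after the exact evaluation, which a small sketch makes concrete. The helper names `onemax` and `noisy_eval` and the parameterization are illustrative assumptions, not the paper's notation; the comments on scaling summarize the abstract's findings.

```python
import random

def onemax(x):
    # Exact fitness: number of one-bits.
    return sum(x)

def noisy_eval(x, noise="gauss", sigma=1.0, rng=random):
    # Posterior noise: sampled independently and added AFTER the exact evaluation.
    f = onemax(x)
    if noise == "gauss":
        # Gaussian noise: the (mu+1) EA on OneMax does not scale gracefully.
        return f + rng.gauss(0.0, sigma)
    if noise == "exp":
        # Exponential tails (symmetrized): also no graceful scaling.
        return f + rng.choice([-1, 1]) * rng.expovariate(1.0 / sigma)
    if noise == "uniform":
        # Bounded support: graceful scaling, per the paper's result.
        return f + rng.uniform(-sigma, sigma)
    raise ValueError(noise)
```

The key structural difference is bounded versus unbounded support: uniform noise can perturb a fitness value by at most `sigma`, while both Gaussian and exponential noise occasionally produce arbitrarily large deviations.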

Friedrich, T., Kötzing, T., Sutton, A.M.: On the Robustness of Evolving Populations. Parallel Problem Solving From Nature (PPSN). pp. 771-781 (2016).
Most theoretical work that studies the benefit of recombination focuses on the ability of crossover to speed up optimization time on specific search problems. In this paper, we take a slightly different perspective and investigate recombination in the context of evolving solutions that exhibit \emph{mutational robustness}, i.e., they display insensitivity to small perturbations. Various models in population genetics have demonstrated that increasing the effective recombination rate promotes the evolution of robustness. We show that this result also holds in the context of evolutionary computation by proving that crossover promotes the evolution of robust solutions in the standard $(\mu+1)$ GA. Surprisingly, our results show that the effect is present even when robust solutions are at a selective disadvantage due to lower fitness values.

Dang, D.C., Lehre, P.K., Friedrich, T., Kötzing, T., Krejca, M.S., Oliveto, P.S., Sudholt, D., Sutton, A.M.: Emergence of Diversity and its Benefits for Crossover in Genetic Algorithms. Parallel Problem Solving From Nature (PPSN). pp. 890-900 (2016).
Population diversity is essential for avoiding premature convergence in Genetic Algorithms (GAs) and for the effective use of crossover. Yet the dynamics of how diversity emerges in populations are not well understood. We use rigorous runtime analysis to gain insight into population dynamics and GA performance for a standard \((\mu+1)\) GA and the \(Jump_k\) test function. By studying the stochastic process underlying the size of the largest collection of identical genotypes we show that the interplay of crossover followed by mutation may serve as a catalyst leading to a sudden burst of diversity. This leads to improvements of the expected optimisation time of order \(\Omega(n/ \log n)\) compared to mutation-only algorithms like the \((1+1)\) EA.
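A bare-bones version of the setting studied here, a \((\mu+1)\) GA applying uniform crossover followed by standard bit mutation on \(Jump_k\), can be sketched as follows. The population size, crossover probability, and all function names are illustrative assumptions of this sketch, not the exact parameterization analyzed in the paper.

```python
import random

def jump_k(x, k):
    # Jump_k: OneMax shifted up by k, with a fitness valley of width k-1
    # just below the all-ones optimum.
    n, s = len(x), sum(x)
    if s == n or s <= n - k:
        return k + s
    return n - s

def mu_plus_one_ga(n=12, k=2, mu=8, pc=0.5, max_iters=50000, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(2) for _ in range(n)] for _ in range(mu)]
    for _ in range(max_iters):
        if any(sum(z) == n for z in pop):
            return True  # all-ones optimum present in the population
        if rng.random() < pc:
            # Uniform crossover of two random parents.
            p1, p2 = rng.sample(pop, 2)
            child = [rng.choice(pair) for pair in zip(p1, p2)]
        else:
            child = list(rng.choice(pop))
        # Standard bit mutation: flip each bit independently with probability 1/n.
        child = [b ^ (rng.random() < 1.0 / n) for b in child]
        pop.append(child)
        # Elitist replacement: discard a worst individual.
        pop.remove(min(pop, key=lambda z: jump_k(z, k)))
    return any(sum(z) == n for z in pop)
```

In this sketch, crossover of two distinct plateau individuals (each with \(n-k\) ones) can recombine their one-bits so that mutation only has to finish the jump, which is the mechanism behind the speed-up over mutation-only algorithms.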