Motivation
An important component of most large software systems is the optimization of individual parts and their interconnection. For the task of solving hard optimization problems, Genetic Algorithms (GAs) have been used successfully for more than 50 years, tackling an ever-growing range of problems. In fact, some researchers believe that GAs "are the next step forward from deep learning: the form of AI that can think outside the box. And it is this kind of creativity that we need to advance AI beyond its current achievements." [1, highlights added]
Classical approaches to optimization, for example (integer) linear programming, require the objective function (or some sufficient approximation) to be explicitly available. However, for complex optimization problems, the (possibly high-dimensional) objective function is typically not given as a mathematical function, but merely accessible indirectly as a "black box" which can be queried for the quality of a proposed solution. The earliest, simple approaches to black-box optimization are concerned with local search: iteratively improve a solution by making local (small) changes. Building on these initial ideas and on inspiration from nature (in particular evolution), modern black-box optimization algorithms transcend the abilities of mere local search by employing a wealth of different strategies. In various applications, these nature-inspired algorithms have proven their worth: successful examples include the optimization of antennas for NASA, of the turbine geometry of the GE engine for the Boeing 777, and of the first clinically approved anti-viral drug for HIV.
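The local-search idea described above can be sketched as a simple hill climber that queries the black box only for solution quality. The sketch below is illustrative rather than taken from the source; it uses the standard toy OneMax problem (maximize the number of ones in a bit string), and all function names are assumptions made for this example:

```python
import random

def hill_climb(objective, solution, neighbors, iterations=1000):
    """Local search: repeatedly try a local (small) change and keep it
    only if the black-box objective reports an improvement."""
    best_value = objective(solution)
    for _ in range(iterations):
        candidate = random.choice(neighbors(solution))
        value = objective(candidate)
        if value > best_value:  # accept only improving moves
            solution, best_value = candidate, value
    return solution, best_value

# Toy black box (OneMax): count the ones in a bit string.
def onemax(bits):
    return sum(bits)

def flip_one_bit(bits):
    # All solutions reachable by a single local change (one bit flip).
    return [bits[:i] + [1 - bits[i]] + bits[i + 1:] for i in range(len(bits))]

random.seed(0)
start = [random.randint(0, 1) for _ in range(20)]
best, value = hill_climb(onemax, start, flip_one_bit)
```

On OneMax every local optimum is also the global optimum, so this hill climber reliably finds the all-ones string; on harder landscapes it gets trapped in local optima, which is exactly the limitation that the nature-inspired strategies discussed next are designed to overcome.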