03.04. 11:00 
A1.1 
Sebastian Siebertz 
An introduction to bounded expansion and nowhere dense graph classes
The notions of bounded expansion and nowhere denseness offer abstract and general definitions of uniform sparseness in graphs. Many familiar sparse graph classes, such as the planar graphs, and in fact all graphs excluding a fixed minor or topological minor, have bounded expansion and are nowhere dense. Classes with these properties admit efficient algorithms for local problems, such as the independent set problem, the dominating set problem, and the model-checking problem for first-order logic. In this talk I will give an introduction to the theory of bounded expansion and nowhere dense graph classes with a focus on local separation properties that lead to efficient algorithms.

25.04. 13:30 
HE.52 
Hans Gawendowicz, Jannik Peters, Fabian Sommer, Daniel Stephan 
Maximizing Modularity in Big Graphs
Real-world graphs such as social networks often consist of multiple communities.
Finding these communities using the graph structure is an important task in data analytics.
A common heuristic to do so is the Louvain algorithm [Blondel et al. 2008], which measures
the quality of a solution using the notion of modularity, as defined by Newman and Girvan [Newman and Girvan 2004].
In the talk, we present two theorems that give additional insight into the structure of such communities and into
the low runtime of the Louvain algorithm.
We also motivate multiple modifications to the Louvain algorithm and measure their impact on the runtime and the
quality of the resulting communities in different graphs.
Furthermore, we provide experimental results for two conjectures which could be the subject of further research.
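As a concrete reference point for the quality measure used by the Louvain algorithm, Newman–Girvan modularity can be computed directly from an edge list and a community assignment. The following is a minimal sketch; the example graph and its split are illustrative, not taken from the talk.

```python
from collections import defaultdict

def modularity(edges, community):
    """Newman-Girvan modularity: Q = sum_c (e_c / m - (d_c / (2m))^2),
    where e_c counts intra-community edges and d_c sums degrees in c."""
    m = len(edges)
    internal = defaultdict(int)  # edges with both endpoints in the community
    degree = defaultdict(int)    # total degree accumulated per community
    for u, v in edges:
        degree[community[u]] += 1
        degree[community[v]] += 1
        if community[u] == community[v]:
            internal[community[u]] += 1
    return sum(internal[c] / m - (degree[c] / (2 * m)) ** 2 for c in degree)

# Two triangles joined by a bridge: the natural split scores Q = 5/14.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
print(modularity(edges, community))
```

Putting the whole graph into a single community always yields Q = 0, which is why maximizing modularity rewards splits that concentrate edges inside communities.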

29.04. 11:00 
H2.57 
Master Project Team WS18 
On Strategic Schelling Segregation
Residential areas tend to segregate over time along racial/ethnic, religious or socioeconomic lines. To
demonstrate that this can happen even with very tolerant residents, Schelling introduced a simple toy model
which has become influential in sociology. In Schelling's model, two types of agents are placed on a grid, and
an agent is content with her location if the fraction of her neighbors on the grid who have the same type as
her is above \(\tau\), for some \(0 < \tau < 1\). Discontent agents simply swap their location with a randomly chosen
other discontent agent or they jump to a random empty cell. Simulations of this process typically reveal that
the society of agents segregates into a few homogeneous regions even for \(\tau < 1/2\), i.e., when all agents are
tolerant. Despite the popularity of Schelling’s model, many questions about its properties are still open and
some of them have been attacked only very recently.
This paper presents the next step towards a rigorous analysis. We map the boundary of when convergence to
an equilibrium is guaranteed. For this we prove several sharp threshold results where guaranteed convergence
suddenly turns into the strongest possible non-convergence result: a violation of weak acyclicity. In particular,
we show such threshold results also for Schelling’s original model, which is in stark contrast to the standard
assumption in many empirical papers. On the conceptual side, we investigate generalizations of Schelling’s
model for more than two agent types and we allow more general underlying graphs which model the residential
area. We show that the number of agent types and the topology of the underlying graph heavily influence the
dynamic properties, the computational hardness of finding an optimum agent placement and the obtained
segregation strength. For the latter, we provide empirical results which indicate that geometry is essential for
strong segregation.
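The jump dynamics from the model can be sketched in a few lines. Grid size, the Moore neighborhood, and the "at least \(\tau\)" contentment convention are illustrative choices here, not the paper's exact setup.

```python
import random

def neighbors(grid, r, c):
    """Moore neighborhood: the up-to-8 surrounding cells inside the grid."""
    n = len(grid)
    return [grid[r + dr][c + dc]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0) and 0 <= r + dr < n and 0 <= c + dc < n]

def content(grid, r, c, tau):
    """An agent is content if the fraction of same-type agents among its
    occupied neighbor cells is at least tau (one common convention)."""
    occupied = [x for x in neighbors(grid, r, c) if x is not None]
    if not occupied:
        return True
    return sum(1 for x in occupied if x == grid[r][c]) / len(occupied) >= tau

def jump_round(grid, tau, rng):
    """One round of jump dynamics: every discontent agent jumps to a
    uniformly random empty cell. Returns True if anyone moved."""
    n, moved = len(grid), False
    for r in range(n):
        for c in range(n):
            if grid[r][c] is not None and not content(grid, r, c, tau):
                empty = [(i, j) for i in range(n) for j in range(n)
                         if grid[i][j] is None]
                if empty:
                    i, j = rng.choice(empty)
                    grid[i][j], grid[r][c] = grid[r][c], None
                    moved = True
    return moved
```

On a fully segregated placement every agent is already content, so a round moves no one; iterating `jump_round` until it returns False is exactly the convergence question the threshold results address.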

16.05. 13:30 
A1.1 
Thomas Bläsius 
On the Empirical Time Complexity of Scale-Free 3-SAT at the Phase Transition
The hardness of formulas at the solubility phase transition of random propositional satisfiability (SAT) has been intensely studied for decades both empirically and theoretically. Solvers based on stochastic local search (SLS) appear to scale very well at the critical threshold, while complete backtracking solvers exhibit exponential scaling. On industrial SAT instances, this phenomenon is inverted: backtracking solvers can tackle large industrial problems, where SLS-based solvers appear to stall. Industrial instances exhibit sharply different structure from uniform random instances. Among many other properties, they are often heterogeneous in the sense that some variables appear in many clauses while others appear in only a few.
We conjecture that the heterogeneity of SAT formulas alone already contributes to the tradeoff in performance between SLS solvers and complete backtracking solvers. We empirically determine how the run time of SLS vs. backtracking solvers depends on the heterogeneity of the input, which is controlled by drawing variables according to a scale-free distribution. Our experiments reveal that the efficiency of complete solvers at the phase transition is strongly related to the heterogeneity of the degree distribution. We report results that suggest the depth of satisfying assignments in complete search trees is influenced by the level of heterogeneity as measured by a power-law exponent. We also find that incomplete SLS solvers, which scale well on uniform instances, are not affected by heterogeneity. The main contribution of this paper utilizes the scale-free random 3-SAT model to isolate heterogeneity as an important factor in the scaling discrepancy between complete and SLS solvers at the uniform phase transition found in previous works.
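One way to make the instance distribution concrete: draw variable \(i\) with probability proportional to \(i^{-1/(\beta-1)}\), so that a smaller power-law exponent \(\beta\) gives heavier heterogeneity. This parameterization is an assumption for illustration; the paper's exact model may differ in details.

```python
import random

def scale_free_3sat(n, m, beta, seed=0):
    """Sample m clauses over n variables, drawing each variable with
    probability proportional to (i+1)^(-1/(beta-1)). Clauses with
    repeated variables are resampled; each literal is negated with
    probability 1/2."""
    rng = random.Random(seed)
    weights = [(i + 1) ** (-1.0 / (beta - 1)) for i in range(n)]
    clauses = []
    while len(clauses) < m:
        vs = rng.choices(range(1, n + 1), weights=weights, k=3)
        if len(set(vs)) < 3:  # reject clauses with repeated variables
            continue
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vs))
    return clauses
```

As \(\beta \to \infty\) the weights flatten and the model approaches uniform random 3-SAT, which is what lets heterogeneity be dialed in as an isolated experimental factor.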

21.05. 11:00 
A1.1 
Warut Suksompong 
Schelling Games on Graphs
We consider strategic games that are inspired by Schelling’s model of residential segregation. In our model, the agents are partitioned into k types and need to select locations on an undirected graph. Agents can be either stubborn, in which case they will always choose their preferred location, or strategic, in which case they aim to maximize the fraction of agents of their own type in their neighborhood. We investigate the existence of equilibria in these games, study the complexity of finding an equilibrium outcome or an outcome with high social welfare, and also provide upper and lower bounds on the price of anarchy and stability. Some of our results extend to the setting where the preferences of the agents over their neighbors are defined by a social network rather than a partition into types. (Joint work with Edith Elkind, Jiarui Gan, Ayumi Igarashi, and Alexandros Voudouris)

22.05. 13:30 
A1.2 
Fabian Sommer 
More Realistic Hyperbolic Random Graphs
Hyperbolic random graphs are a model for generating random graphs by drawing points from a hyperbolic disk. These graphs have a small diameter, high clustering, and a power-law degree distribution. They are therefore similar to many real-world graphs such as social networks.
However, this resemblance is not perfect. For example, all nodes that are close to the center of the disk form a very large clique. Even with temperature, these nodes still form a very dense subgraph. Another property of hyperbolic random graphs is that after removing all central nodes, the remaining graph forms a ring of low pathwidth. While this matches the behavior of many real-world networks, there are some that do not conform to this.
In this talk, we explore different measurements that could reflect these differences and possible approaches to generate more realistic graphs.
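For reference, the baseline model being modified can be sketched as follows: \(n\) points in a disk of radius \(R = 2\ln n + C\), radii with density proportional to \(\sinh(\alpha r)\), uniform angles, and an edge whenever the hyperbolic distance is at most \(R\). This is the threshold (zero-temperature) variant, and the parameter values are illustrative.

```python
import math
import random

def hyperbolic_random_graph(n, alpha=0.75, C=0.0, seed=0):
    """Threshold hyperbolic random graph: sample n points (radius, angle)
    in a disk of radius R = 2 ln n + C and connect pairs whose hyperbolic
    distance is at most R."""
    rng = random.Random(seed)
    R = 2 * math.log(n) + C
    pts = []
    for _ in range(n):
        # inverse-transform sampling of F(r) = (cosh(a r)-1)/(cosh(a R)-1)
        u = rng.random()
        r = math.acosh(1 + u * (math.cosh(alpha * R) - 1)) / alpha
        pts.append((r, rng.uniform(0, 2 * math.pi)))

    def dist(p, q):
        dphi = math.pi - abs(math.pi - abs(p[1] - q[1]))  # angle difference
        arg = (math.cosh(p[0]) * math.cosh(q[0])
               - math.sinh(p[0]) * math.sinh(q[0]) * math.cos(dphi))
        return math.acosh(max(1.0, arg))  # clamp guards rounding error

    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if dist(pts[i], pts[j]) <= R]
    return pts, edges
```

The dense center mentioned above is visible directly in this sketch: any two points with small radii are within distance \(R\) of each other regardless of angle, so the central nodes form a clique.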

27.05. 11:00 
HE.52 
Karl Bringmann 
Approximating APSP without Scaling
All Pairs Shortest Paths (APSP) can be solved exactly in \(\mathcal{O}(n^3)\) time, and no significantly faster algorithms are known on dense graphs. Relaxing the goal, Zwick showed how to compute a \((1+\varepsilon)\)-approximation by utilizing fast matrix multiplication. This algorithm can be used to approximate several graph characteristics, including the diameter, radius, median, minimum-weight triangle, and minimum-weight cycle.
For integer edge weights bounded by \(W\), Zwick's running time involves a factor \(\log(W)\). Here we study whether this factor can be avoided, that is, whether APSP and related problems admit "strongly polynomial" approximation schemes, whose running time is independent of \(W\).
We remove the \(\log(W)\) factor from Zwick's running time for APSP on undirected graphs, as well as for several graph characteristics on directed or undirected graphs.
For APSP on directed graphs, we design a strongly polynomial approximation scheme with a worse dependence on \(n\) than Zwick's. We also explain why: any improvement over our exponent would improve the best known algorithm for the Min-Max Product. In fact, we prove that approximating directed APSP and exactly computing the Min-Max Product are equivalent.
Joint work with Marvin Künnemann and Karol Wegrzycki.
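For readers unfamiliar with the Min-Max Product in the equivalence above: entrywise it is \((A \circledast B)_{ij} = \min_k \max(A_{ik}, B_{kj})\). A direct cubic-time sketch (not the algorithm from the talk):

```python
def min_max_product(A, B):
    """Min-Max product: entry (i, j) is min over k of max(A[i][k], B[k][j]).
    Iterating this product on a graph's weight matrix yields all-pairs
    bottleneck shortest paths."""
    n, p, m = len(A), len(B), len(B[0])
    return [[min(max(A[i][k], B[k][j]) for k in range(p))
             for j in range(m)] for i in range(n)]
```

The min-plus product that underlies exact APSP has the same shape with `max` replaced by `+`, which is why running-time transfers between the two problems are plausible.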

12.06. 10:00 
HE.52 
Elias J. Schneider 
Time optimization of steered swimmer's trajectories in a potential landscape
Research into active matter, and into microswimmers in particular, is becoming increasingly
important due to the growing development of microtechnology. For certain tasks it may be
useful to know the fastest path of a microswimmer, such as a bacterium, from one location
to another under given boundary conditions. This master's thesis deals with the magnetotactic
bacterium Magnetococcus marinus as an example of a steerable microswimmer. It has a constant
intrinsic thrust and can be guided by an external magnetic field. In a given two-dimensional
liquid environment, the microswimmer is to be steered so that it travels from start to finish
through a given potential landscape, such as a sombrero potential, in the shortest time.
The problem is solved both through the calculus of variations and through Q-learning, a
method of reinforcement learning in which the microswimmer learns the fastest route by
exploration.
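The Q-learning side can be illustrated with a toy stand-in: an agent on a grid paying a cost of \(-1\) per step, so that maximizing return means minimizing travel time. This sketch has no potential landscape or swimmer dynamics; it only shows the tabular update rule the method is built on.

```python
import random

def q_learning_shortest_path(n=5, episodes=2000, alpha=0.5, gamma=0.95,
                             eps=0.2, seed=0):
    """Tabular Q-learning on an n x n grid: start at (0, 0), episode ends
    at (n-1, n-1), reward -1 per step. Returns the greedy path length
    after training (capped to avoid infinite loops)."""
    rng = random.Random(seed)
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    goal = (n - 1, n - 1)
    Q = {}

    def step(s, a):  # move and clamp to the grid
        dr, dc = moves[a]
        return (min(n - 1, max(0, s[0] + dr)), min(n - 1, max(0, s[1] + dc)))

    for _ in range(episodes):
        s = (0, 0)
        while s != goal:
            # epsilon-greedy action selection
            a = rng.randrange(4) if rng.random() < eps else \
                max(range(4), key=lambda i: Q.get((s, i), 0.0))
            s2 = step(s, a)
            # Q-learning update: bootstrap from the best next action
            target = -1.0 + gamma * max(Q.get((s2, i), 0.0) for i in range(4))
            Q[(s, a)] = Q.get((s, a), 0.0) + alpha * (target - Q.get((s, a), 0.0))
            s = s2

    s, steps = (0, 0), 0
    while s != goal and steps < 4 * n * n:
        a = max(range(4), key=lambda i: Q.get((s, i), -float("inf")))
        s = step(s, a)
        steps += 1
    return steps
```

In the thesis setting the state would additionally encode position in the potential landscape and the action would set the magnetic steering direction; the update rule stays the same.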

20.06. 13:30 
A1.1 
Hagen Echzell, Simon Krogmann 
Master Thesis Introduction Talks
Hagen Echzell: Flow-Based Network Creation Games
Simon Krogmann: Facility Location Games on Networks

21.06. 13:30 
A1.2 
Sören Tietböhl 
Master Thesis Introduction Talk
VSIDS for (1+1) EA Mutation Operators

26.06. 13:30 
A1.2 
Ralf Rothenberger 
The Satisfiability Threshold for Non-Uniform Random 2-SAT
Propositional satisfiability (SAT) is one of the most fundamental problems in computer science.
Its worst-case hardness lies at the core of computational complexity theory, for example in the form of NP-hardness and the (Strong) Exponential Time Hypothesis.
In practice however, SAT instances can often be solved efficiently.
This contradictory behavior has spawned interest in the average-case analysis of SAT and has triggered the development of sophisticated rigorous and non-rigorous
techniques for analyzing random structures.
Despite a long line of research and substantial progress, most theoretical work on random SAT assumes a uniform distribution on the variables.
In contrast, real-world instances often exhibit large fluctuations in variable occurrence.
This can be modeled by a non-uniform distribution of the variables, which can result in distributions closer to industrial SAT instances.
We study satisfiability thresholds of non-uniform random 2-SAT with \(n\) variables and \(m\) clauses and with an arbitrary probability distribution \((p_i)_{i\in[n]}\) with \(p_1 \geq p_2 \geq \dots \geq p_n > 0\) over the \(n\) variables.
We show for \(p_1^2=\Theta(\sum_{i=1}^n p_i^2)\) that the asymptotic satisfiability threshold is at
\(m=\Theta\big((1-\sum_{i=1}^n p_i^2)/(p_1\cdot(\sum_{i=2}^n p_i^2)^{1/2})\big)\) and that it is coarse.
For \(p_1^2=o(\sum_{i=1}^n p_i^2)\) we show that there is a sharp satisfiability threshold at \(m=(\sum_{i=1}^n p_i^2)^{-1}\).
This result generalizes the seminal works by Chvatal and Reed [FOCS 1992] and by Goerdt [JCSS 1996].
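To make the model concrete, here is a sketch of sampling such an instance, together with a brute-force satisfiability check usable at toy sizes. The power-law choice of \((p_i)\) is one illustrative distribution satisfying the monotonicity assumption, not the paper's only setting.

```python
import random
from itertools import product

def nonuniform_2sat(n, m, beta=2.5, seed=0):
    """Random 2-SAT where variable i is drawn with probability
    proportional to (i+1)^(-1/(beta-1)); each clause has two distinct
    variables with random polarities."""
    rng = random.Random(seed)
    weights = [(i + 1) ** (-1.0 / (beta - 1)) for i in range(n)]
    clauses = []
    while len(clauses) < m:
        u, v = rng.choices(range(1, n + 1), weights=weights, k=2)
        if u == v:  # reject clauses with a repeated variable
            continue
        clauses.append((u if rng.random() < 0.5 else -u,
                        v if rng.random() < 0.5 else -v))
    return clauses

def satisfiable(clauses, n):
    """Brute-force check over all 2^n assignments (only for tiny n);
    2-SAT is of course decidable in linear time via implication graphs."""
    return any(all(any(assign[abs(l) - 1] == (l > 0) for l in cl)
                   for cl in clauses)
               for assign in product([False, True], repeat=n))
```

In the uniform case \(p_i = 1/n\) the sharp threshold \((\sum_i p_i^2)^{-1}\) evaluates to \(m = n\), i.e., clause-variable ratio 1, matching the classical 2-SAT threshold.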

27.06. 13:30 
A1.2 
Vanja Doskoč 
Better Cautious than Sorry - Or Not?: Cautiousness in Algorithmic Learning Theory
2, 4, 8, 16, 32,... What is the next element in this sequence? What is the underlying set?
Confronted with the latter question, the odd natural numbers probably would not come to mind
as a legitimate answer immediately. This arises from a seemingly natural desire for consistency.
However, the question is whether such a desire restricts one's actual learning power. Naturally,
one can ask the same question for various other learning strategies.
We investigate how different forms of overgeneralization known from the literature interact and whether
they restrict learning power.

03.07. 13:30 
A1.2 
Frank Neumann 
Evolutionary Diversity Optimisation
Evolutionary diversity optimization aims to compute a diverse set of solutions where all solutions meet a given quality criterion.
In this talk, I will give an overview of work that we carried out during the last year in this area.
Furthermore, I will point out some research questions that we plan to tackle within a project funded by the Australian Research Council during 2019-2021.
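As a hedged illustration of the general framework (not of the specific algorithms in the talk): keep a set of \(\mu\) solutions that all meet a quality threshold, and greedily evolve the set toward higher pairwise diversity. Here diversity is mean Hamming distance and the quality criterion is OneMax, both illustrative choices.

```python
import random

def diversity(pop):
    """Diversity as mean pairwise Hamming distance (one common choice)."""
    if len(pop) < 2:
        return 0.0
    total = sum(sum(a != b for a, b in zip(x, y))
                for i, x in enumerate(pop) for y in pop[i + 1:])
    return 2 * total / (len(pop) * (len(pop) - 1))

def edo(fitness, n_bits, mu, threshold, iters=2000, seed=0):
    """Evolutionary diversity optimisation sketch: mutate a random
    member, keep the child only if it meets the quality threshold and
    swapping it in strictly improves the set's diversity."""
    rng = random.Random(seed)
    pop = [[1] * n_bits for _ in range(mu)]  # mu copies of a feasible point
    for _ in range(iters):
        parent = rng.choice(pop)
        # standard bit mutation with rate 1/n_bits
        child = [b ^ (rng.random() < 1.0 / n_bits) for b in parent]
        if fitness(child) < threshold:
            continue
        best, best_div = None, diversity(pop)
        for i in range(mu):
            d = diversity(pop[:i] + pop[i + 1:] + [child])
            if d > best_div:
                best, best_div = i, d
        if best is not None:
            pop[best] = child
    return pop
```

The key design choice is that selection acts on the diversity of the whole set rather than on individual fitness; fitness only gates feasibility.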

26.07. 13:30 
A2.2 
Manuel Rizzo 
TBA

20.08. 10:00 
A1.1 
Karen Seidel 
TBA 