I received a PhD in computer science from the University of Lübeck and a "Habilitation" from Saarland University, and I held postdoctoral positions at Saarland University and at Yale University. Since 2009, I have been affiliated with the University of Twente.
My research area is the design and analysis of algorithms, in particular smoothed and probabilistic analysis of algorithms.
Courses academic year 2024/2025
Courses in the current academic year are added as soon as they are finalised in the Osiris system, so the list may not yet be complete for the whole academic year.
- 191508209 - Internship AM
- 191508309 - Final Project (combination)
- 191508409 - Final Project M-AM
- 201300042 - Limits to Computing
- 201600421 - Algorithms beyond the Worst Case
- 202200142 - Modelling and Programming 1
- 202200239 - Modelling and Programming 2
- 202300130 - Capita Selecta Applied Mathematics
- 202400669 - Capita Selecta Applied Mathematics 2
Courses academic year 2023/2024
- 191508209 - Internship AM
- 191508309 - Final Project (combination)
- 191508409 - Final Project M-AM
- 200900030 - Research of Math
- 201300042 - Limits to Computing
- 202001215 - Calculus 1 for TN
- 202001280 - Finding vs. Verifying
- 202001361 - Languages & Machines
- 202001379 - Bachelor's Assignment
- 202200142 - Modelling and Programming 1
- 202200239 - Modelling and Programming 2
- 202200395 - Nonparametric Statistics
- 202200398 - Internship AM-CS
- 202300130 - Capita Selecta Applied Mathematics
- 202300156 - Modelling 2 - AM/TCS
Current projects
Rigorous Analysis of Local Search
Large-scale optimization problems appear in many areas, ranging from engineering and scheduling to the sciences. Unfortunately, for many optimization problems it is unlikely that we can find optimal solutions efficiently. Still, in practice, quite simple local search heuristics often succeed in finding close-to-optimal solutions surprisingly quickly. This stands in stark contrast to their theoretically predicted performance, which is usually very poor.
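To give an impression of how simple such a heuristic can be, here is a minimal sketch of 2-opt local search for the travelling salesman problem; the code, the random instance, and all names in it are illustrative assumptions for this page and not part of the project itself.

```python
import random

def tour_length(tour, dist):
    """Length of a closed tour under the distance matrix dist."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Plain 2-opt local search: reverse a segment of the tour whenever this
    shortens it, and stop in a local optimum (no improving 2-opt move left)."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # reversing the whole tour changes nothing
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Replace edges (a,b) and (c,d) by (a,c) and (b,d) if that is shorter.
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

# Hypothetical usage: 50 random points in the unit square.
random.seed(1)
pts = [(random.random(), random.random()) for _ in range(50)]
dist = [[((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 for (x2, y2) in pts]
        for (x1, y1) in pts]
tour = list(range(len(pts)))
print(tour_length(tour, dist))
print(tour_length(two_opt(tour, dist), dist))
```

In the framework of smoothed analysis, one then asks, for example, how many improving steps such a heuristic needs and how good the resulting local optimum is when the input is slightly perturbed.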
A challenge in the analysis of algorithms is to understand why simple local search heuristics show such remarkable performance. Beyond plain curiosity, the reason that we want to understand the performance of heuristics is two-fold: First, proving rigorous bounds on the performance provides stronger guarantees than experimental evaluation alone. Second, understanding precedes improving: once we understand why and when a heuristic works well and when it fails, we can exploit this to design better heuristics. Smoothed analysis is a paradigm to analyze algorithms where classical worst-case analysis fails. Although smoothed analysis is still a young field, it has proved to be a successful tool for analyzing a variety of algorithms.
The goal of this project is to prove rigorous bounds on the performance of heuristics in the framework of smoothed analysis. Beyond "pure" local search algorithms, we will take a further step towards rigorously analyzing algorithms used in practice by considering hybrid heuristics and metaheuristics.
Finished projects
Smoothed Analysis of Belief Propagation
Belief propagation is a heuristic approach for solving large-scale statistical inference problems. It is an easy-to-implement heuristic based on message-passing that usually converges quickly. As such, it has become very popular, and it has proved successful in a wide range of applications. Its success in practice, however, stands in stark contrast to the lack of theoretical understanding of its performance.
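As a purely illustrative sketch of the message-passing idea (the pairwise model, the parallel update schedule, and all names below are assumptions of this example, not a description of the project's results), sum-product loopy belief propagation can be written as follows:

```python
import numpy as np

def loopy_bp(unary, pairwise, edges, iters=50):
    """Sum-product loopy belief propagation on a pairwise model.

    unary[i]    : nonnegative vector phi_i over the states of variable i
    pairwise[e] : matrix psi_e(x_i, x_j) for edge e = (i, j) in edges
    Returns approximate marginals (beliefs) for every variable."""
    n = len(unary)
    neighbours = {i: [] for i in range(n)}
    psi, msgs = {}, {}
    for (i, j), mat in zip(edges, pairwise):
        neighbours[i].append(j)
        neighbours[j].append(i)
        psi[(i, j)] = np.asarray(mat, dtype=float)
        psi[(j, i)] = psi[(i, j)].T
        msgs[(i, j)] = np.ones(len(unary[j])) / len(unary[j])
        msgs[(j, i)] = np.ones(len(unary[i])) / len(unary[i])

    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            # m_{i->j}(x_j) ~ sum_{x_i} phi_i(x_i) psi(x_i, x_j) prod_{k in N(i)\{j}} m_{k->i}(x_i)
            prod = np.asarray(unary[i], dtype=float).copy()
            for k in neighbours[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            m = psi[(i, j)].T @ prod
            new[(i, j)] = m / m.sum()
        msgs = new

    beliefs = []
    for i in range(n):
        b = np.asarray(unary[i], dtype=float).copy()
        for k in neighbours[i]:
            b = b * msgs[(k, i)]
        beliefs.append(b / b.sum())
    return beliefs

# Hypothetical usage: a cycle of three binary variables that prefer to agree.
unary = [np.array([1.0, 1.0])] * 3
attract = np.array([[2.0, 1.0], [1.0, 2.0]])
print(loopy_bp(unary, [attract] * 3, edges=[(0, 1), (1, 2), (2, 0)]))
```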
Belief propagation shares this aspect with many algorithms. The reason for the discrepancy between remarkably good performance in practice and unsatisfactory theoretical understanding is often that performance is analyzed in terms of worst-case analysis, which is far too pessimistic. Indeed, worst-case instances are often artificially constructed and rarely show up in applications. An adequate theoretical analysis, however, should measure the performance in terms of "typical" rather than "pathological" instances. To provide a more realistic analysis of algorithms, the concept of smoothed analysis has been developed: An adversary specifies an instance, and then the expected performance is measured when this instance is slightly randomly perturbed. Smoothed analysis takes into account that practical data is often noisy, e.g., due to measurement errors. Smoothed analysis has already been applied successfully to explain the performance of a variety of algorithms.
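In formulas, this performance measure can be written as follows; the definition goes back to Spielman and Teng, and the concrete notation below is only one possible, hedged rendering of it.

```latex
% Smoothed running time of an algorithm A: the adversary fixes an instance I
% of size n, which is then randomly perturbed with magnitude sigma (for
% instance, Gaussian noise added to the numbers of I) before A is run.
\[
  \operatorname{smoothed}_A(n,\sigma)
    \;=\; \max_{I : |I| = n}\;
      \mathbb{E}_{\tilde{I} \sim \operatorname{pert}_{\sigma}(I)}
      \bigl[ T_A(\tilde{I}) \bigr].
\]
% For sigma -> 0 this recovers worst-case analysis; for very large sigma the
% perturbation drowns out the adversarial choice and one approaches
% average-case analysis.
```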
The aim of this project is a smoothed analysis of belief propagation. We will focus on the application of belief propagation to combinatorial optimization problems. The goal is to get a deeper understanding of its performance and to bridge the gap between theoretical and practical performance of belief propagation.
Framework for Random Metric Spaces
Large-scale optimization problems show up in many domains, such as engineering, scheduling, and economics, but also in the sciences. Unfortunately, finding optimal solutions within reasonable time is often impossible because the problems that have to be solved are computationally intractable. Because of this, optimization problems are nowadays often attacked using ad-hoc heuristics. Many such heuristics show remarkable performance, but their theoretical (worst-case) performance is poor: worst-case analysis is often too pessimistic to reflect the performance observed in practice. To explain the performance of heuristics, probabilistic analysis is the method of choice, where performance is analyzed with respect to random instances.
The instances of many optimization problems involve, implicitly or explicitly, a metric space. These can be physical distances, but also, e.g., costs for travel or transportation. Up to now, however, probabilistic analysis of algorithms has been almost exclusively restricted to Euclidean instances or to instances whose distances are drawn independently, disregarding their metric nature. Both approaches fall short of explaining the average-case performance of heuristics on general metric instances.
Our goal is to develop and apply a framework for random metric spaces. We develop models for random metric spaces, study their properties, and apply these findings to explain the performance of heuristics for optimization problems. With this framework, average-case analysis of heuristics becomes a more realistic benchmark and yields more conclusive insights into performance than the traditionally used models. This pushes our understanding of heuristics forward, helps users to choose the right algorithms, and facilitates the design of better algorithms.
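To illustrate what such a model can look like, one simple example is a random shortest-path metric: draw independent random weights on the edges of the complete graph and take their shortest-path closure, so the triangle inequality holds by construction. The concrete choice of exponentially distributed edge weights in this sketch is an assumption of the illustration, not a statement about the project.

```python
import random

def random_shortest_path_metric(n, seed=0):
    """Draw independent Exp(1) weights on the edges of the complete graph on
    n points and return their shortest-path closure, which is a metric."""
    rng = random.Random(seed)
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d[i][j] = d[j][i] = rng.expovariate(1.0)
    # Floyd-Warshall closure: afterwards d satisfies the triangle inequality.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# Hypothetical usage: a random metric on 10 points.
metric = random_shortest_path_metric(10, seed=42)
print(metric[0][:5])
```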
Address
University of Twente
Zilverling (building no. 11), room 4010
Hallenweg 19
7522 NH Enschede
Netherlands
University of Twente
Zilverling 4010
P.O. Box 217
7500 AE Enschede
Netherlands