I received my PhD in computer science from the University of Lübeck and my "Habilitation" from Saarland University, and I held postdoctoral positions at Saarland University and at Yale University. Since 2009, I have been affiliated with the University of Twente.

Organisations

EEMCS-AM-MOR: Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS), Department of Applied Mathematics (AM), Mathematics of Operations Research (MOR).

My research area is the design and analysis of algorithms, in particular the smoothed and probabilistic analysis of algorithms.

Publications

2024

Worst-Case and Smoothed Analysis of the Hartigan-Wong Method for k-Means Clustering (2024). In 41st International Symposium on Theoretical Aspects of Computer Science (STACS 2024), Article 52 (Leibniz International Proceedings in Informatics, LIPIcs; Vol. 289). Dagstuhl. Manthey, B. & van Rhijn, J. https://doi.org/10.4230/LIPIcs.STACS.2024.52

2023

Approximation Ineffectiveness of a Tour-Untangling Heuristic (2023). In Approximation and Online Algorithms - 21st International Workshop (WAOA 2023), Proceedings (pp. 1-13) (Lecture Notes in Computer Science; Vol. 14297). Springer. Manthey, B. & van Rhijn, J. https://doi.org/10.1007/978-3-031-49815-2_1

Complexity of Local Search for Euclidean Clustering Problems (2023) [Preprint]. ArXiv.org. Manthey, B., Morawietz, N., van Rhijn, J. & Sommer, F. https://doi.org/10.48550/arXiv.2312.14916

Probabilistic Analysis of Optimization Problems on Sparse Random Shortest Path Metrics (2023). Algorithmica, 85(12), 3793-3815. Klootwijk, S. & Manthey, B. https://doi.org/10.1007/s00453-023-01167-3

Improved Smoothed Analysis of 2-Opt for the Euclidean TSP (2023). In 34th International Symposium on Algorithms and Computation (ISAAC 2023), Article 52 (Leibniz International Proceedings in Informatics, LIPIcs; Vol. 283). Dagstuhl. Manthey, B. & van Rhijn, J. https://doi.org/10.4230/LIPIcs.ISAAC.2023.52

Approximation Ineffectiveness of a Tour-Untangling Heuristic (2023) [Preprint, accepted/in press]. ArXiv.org. Manthey, B. & van Rhijn, J. https://arxiv.org/abs/2302.11264

Smoothed Analysis of the 2-Opt Heuristic for the TSP under Gaussian Noise (2023) [Preprint]. ArXiv.org. Künnemann, M., Manthey, B. & Veenstra, R. https://doi.org/10.48550/arXiv.2308.00306

Worst-Case and Smoothed Analysis of the Hartigan-Wong Method for k-Means Clustering (2023) [Preprint, accepted/in press]. ArXiv.org. Manthey, B. & van Rhijn, J. https://arxiv.org/abs/2309.10368

2020

Smoothed Analysis of Local Search (2020). In Beyond the Worst-Case Analysis of Algorithms (pp. 285-308). Cambridge University Press. Manthey, B. https://doi.org/10.1017/9781108637435.018

Probabilistic Analysis of Optimization Problems on Sparse Random Shortest Path Metrics (2020). In 31st International Conference on Probabilistic, Combinatorial and Asymptotic Methods for the Analysis of Algorithms (AofA 2020), Article 19 (Leibniz International Proceedings in Informatics, LIPIcs; Vol. 159). Dagstuhl. Klootwijk, S. & Manthey, B. https://doi.org/10.4230/LIPIcs.AofA.2020.19

Probabilistic analysis of facility location on random shortest path metrics (2020) [Conference abstract]. 15th Cologne-Twente Workshop on Graphs and Combinatorial Optimization, CTW 2017. Klootwijk, S. & Manthey, B. http://ctw.uni-koeln.de/booklet.pdf

Smoothed analysis of the minimum-mean cycle canceling algorithm and the network simplex algorithm (2020). Journal of Graph Algorithms and Applications, 24(3), 397-421. Cornelissen, K. & Manthey, B. https://doi.org/10.7155/jgaa.00539

Current projects

Rigorous Analysis of Local Search

Large-scale optimization problems appear in many areas, ranging from engineering and scheduling to the sciences. Unfortunately, for many optimization problems it is unlikely that we can find optimal solutions efficiently. Still, in practice, quite simple local search heuristics often succeed in finding close-to-optimal solutions surprisingly quickly. This is in stark contrast to their theoretically predicted performance, which is usually very poor.
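
To make this concrete, here is a minimal Python sketch of one such heuristic, 2-Opt for the traveling salesman problem (the heuristic analyzed in several of the publications above). It is an illustration under my own assumptions (random points in the unit square, Euclidean distances), not code from the project:

```python
# Minimal 2-Opt local search for the Euclidean TSP (illustrative sketch).
# The heuristic repeatedly removes two edges of the tour and reconnects it
# the other way whenever this shortens the tour, stopping at a local optimum.
import math
import random

def tour_length(points, tour):
    """Length of the closed tour visiting `points` in the order given by `tour`."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """Apply improving 2-Opt moves until none exists (a 2-Opt local optimum)."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # the two chosen edges share a vertex; skip
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                # Change in length if edges (a,b) and (c,d) are replaced
                # by (a,c) and (b,d).
                delta = (math.dist(points[a], points[c])
                         + math.dist(points[b], points[d])
                         - math.dist(points[a], points[b])
                         - math.dist(points[c], points[d]))
                if delta < -1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(50)]
t = two_opt(pts, list(range(len(pts))))
print(f"length of 2-Opt local optimum: {tour_length(pts, t):.3f}")
```

The gap between theory and practice shows up exactly here: the number of iterations of the improvement loop is exponential in the worst case, yet on instances like the random one above it is typically small.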

A challenge in the analysis of algorithms is to understand why simple local search heuristics show such a remarkable performance. Beyond plain curiosity, the reason that we want to understand the performance of heuristics is two-fold: First, proving rigorous bounds on the performance provides stronger guarantees than experimental evaluation alone. Second, understanding precedes improving: once we understand why and when a heuristic works well and when it fails, we can exploit this to design better heuristics. Smoothed analysis is a paradigm for analyzing algorithms where classical worst-case analysis fails. Although smoothed analysis is still a young field, it has proved to be a successful tool for analyzing a variety of algorithms.

The goal of this project is to prove rigorous bounds on the performance of heuristics in the framework of smoothed analysis. Beyond "pure" local search algorithms, we will make another step towards rigorously analyzing algorithms used in practice by considering hybrid heuristics and metaheuristics.

Finished projects

Smoothed Analysis of Belief Propagation

Belief propagation is a heuristic approach for solving large-scale statistical inference problems. It is an easy-to-implement heuristic based on message passing that usually converges quickly. As such, it has become very popular, and it has proved to be successful in a wide range of applications. Its success in practice, however, is in stark contrast to the lack of theoretical understanding of its performance.
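
As a rough illustration of the message-passing idea (a sketch under my own assumptions, not code from the project), here is synchronous sum-product belief propagation for a pairwise binary Markov random field; the function and variable names are mine:

```python
# Sum-product belief propagation on a pairwise binary MRF (illustrative).
# Messages are passed along the edges until they (usually) converge; the
# resulting beliefs approximate the marginal distributions of the variables.
import numpy as np

def belief_propagation(unary, pairwise, edges, iters=100, tol=1e-8):
    """unary[i]: length-2 potential of variable i; pairwise[(i, j)]: 2x2
    potential indexed [x_i, x_j]; edges: undirected pairs (i, j)."""
    msgs = {(i, j): np.ones(2) for (a, b) in edges for (i, j) in ((a, b), (b, a))}
    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            # Collect the unary potential of i and all messages into i,
            # except the one coming from j.
            incoming = unary[i].copy()
            for (k, l) in msgs:
                if l == i and k != j:
                    incoming *= msgs[(k, l)]
            psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
            m = psi.T @ incoming           # sum out the state of variable i
            new[(i, j)] = m / m.sum()      # normalize for numerical stability
        converged = max(np.abs(new[e] - msgs[e]).max() for e in msgs) < tol
        msgs = new
        if converged:
            break
    beliefs = []
    for i in range(len(unary)):
        b = unary[i].copy()
        for (k, l) in msgs:
            if l == i:
                b *= msgs[(k, l)]
        beliefs.append(b / b.sum())
    return beliefs

# Tiny example: a 3-cycle (so the graph has a loop) with attractive couplings
# and one biased variable; the beliefs are only approximate marginals here.
unary = [np.array([0.8, 0.2]), np.array([0.5, 0.5]), np.array([0.5, 0.5])]
coupling = np.array([[2.0, 1.0], [1.0, 2.0]])
pairwise = {(0, 1): coupling, (1, 2): coupling, (0, 2): coupling}
print(belief_propagation(unary, pairwise, [(0, 1), (1, 2), (0, 2)]))
```

On trees this computes exact marginals; on graphs with cycles ("loopy" belief propagation, as in the example) neither convergence nor correctness is guaranteed, which is precisely the kind of gap between practice and theory at issue here.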

Belief propagation shares this trait with many algorithms. The reason for the discrepancy between remarkably good performance in practice and unsatisfactory theoretical understanding is often that performance is analyzed in terms of worst-case analysis, which is often far too pessimistic. Indeed, worst-case instances are often artificially constructed and rarely show up in applications. An adequate theoretical analysis, however, should measure the performance in terms of "typical" rather than "pathological" instances. To provide a more realistic analysis of algorithms, the concept of smoothed analysis has been developed: An adversary specifies an instance, and then the expected performance is measured when this instance is slightly randomly perturbed. Smoothed analysis takes into account that practical data is often noisy, e.g., due to measurement errors. Smoothed analysis has already been applied successfully to explain the performance of a variety of algorithms.
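
One common way to formalize this (assuming, as in the original model of Spielman and Teng, Gaussian perturbations of standard deviation $\sigma$; the notation here is mine) is the smoothed running time

\[
T_{\mathrm{smooth}}(n, \sigma) \;=\; \max_{x} \; \mathbb{E}_{g \sim \mathcal{N}(0, \sigma^2 I)}\bigl[\, T(x + g) \,\bigr],
\]

where the maximum ranges over adversarial instances $x$ of size $n$ and $T$ is the running time (or another performance measure). For $\sigma \to 0$ this recovers worst-case analysis; for large $\sigma$ the perturbation drowns out the adversarial instance and the measure approaches an average-case analysis.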

The aim of this project is a smoothed analysis of belief propagation. We will focus on the application of belief propagation to combinatorial optimization problems. The goal is to get a deeper understanding of its performance and to bridge the gap between theoretical and practical performance of belief propagation.

Framework for Random Metric Spaces

Large-scale optimization problems show up in many domains, such as engineering, scheduling, and economics, but also in the sciences. Unfortunately, finding optimal solutions within reasonable time is often impossible, because the problems that have to be solved are computationally intractable. Because of this, optimization problems are nowadays often attacked using ad hoc heuristics. Many such heuristics show a remarkable performance, but their theoretical (worst-case) performance is poor: worst-case analysis is often too pessimistic to reflect the performance observed in practice. In order to explain the performance of heuristics, probabilistic analysis is the method of choice, where performance is analyzed with respect to random instances.

The instances of many optimization problems involve, implicitly or explicitly, a metric space. This can be physical distances, but also, e.g., costs for travel or transportation. Up to now, however, the probabilistic analysis of algorithms has been almost exclusively restricted to Euclidean instances or to instances whose distances are drawn independently, disregarding the metric nature. Both approaches fall short of explaining the average-case performance of heuristics on general metric instances.
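
A minimal sketch of one such model, the random shortest path metric named in the publications above (the concrete parameter choices here, a complete graph with i.i.d. Exp(1) edge weights, are my assumptions):

```python
# Generate a random shortest path metric (illustrative sketch): draw i.i.d.
# exponential weights on the edges of the complete graph K_n and take
# shortest-path distances. The result is a random metric that is neither
# Euclidean nor has independently drawn distances.
import itertools
import random

def random_shortest_path_metric(n, seed=None):
    """Return an n x n matrix of shortest-path distances in K_n with
    independent Exp(1) edge weights, closed via Floyd-Warshall."""
    rng = random.Random(seed)
    d = [[0.0] * n for _ in range(n)]
    for i, j in itertools.combinations(range(n), 2):
        d[i][j] = d[j][i] = rng.expovariate(1.0)  # i.i.d. exponential weight
    # Floyd-Warshall: shortcut through intermediate vertices, which
    # enforces the triangle inequality.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

m = random_shortest_path_metric(6, seed=1)
# Sanity check: the triangle inequality holds for all triples.
assert all(m[i][j] <= m[i][k] + m[k][j] + 1e-12
           for i in range(6) for j in range(6) for k in range(6))
```

The correlations introduced by the shortest-path computation are exactly what distinguishes this model from drawing all pairwise distances independently.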

Our goal is to develop and apply a framework for random metric spaces. We develop models for random metric spaces, study their properties, and apply these findings to explain the performance of heuristics for optimization problems. With this framework, average-case analysis of heuristics becomes a more realistic benchmark and yields more conclusive insights about performance than with the traditionally used models. This pushes our understanding of heuristics forward, helps users to choose the right algorithms, and facilitates the design of better algorithms.

Address

University of Twente

Zilverling (building no. 11), room 4010
Hallenweg 19
7522 NH Enschede
Netherlands
