Theory Seminar

Michael Dinitz: Faster Matchings via Learned Duals

Michael Dinitz, Johns Hopkins University
WHERE:
3725 Beyster Building
A recent line of research investigates how algorithms can be augmented with machine-learned predictions to overcome worst-case lower bounds. This area has revealed interesting algorithmic insights into a range of problems, with particular success in the design of competitive online algorithms. However, the question of improving algorithm running times with predictions has remained largely unexplored.

We take a first step in this direction by combining the idea of machine-learned predictions with the idea of "warm-starting" primal-dual algorithms. We consider one of the most important primitives in combinatorial optimization: weighted bipartite matching and its generalization to b-matching. We identify three key challenges when using learned dual variables in a primal-dual algorithm. First, predicted duals may be infeasible, so we give an algorithm that efficiently maps predicted infeasible duals to nearby feasible solutions. Second, once the duals are feasible, they may not be optimal, so we show that they can be used to quickly find an optimal solution. Finally, such predictions are useful only if they can be learned, so we show that the problem of learning duals for matching has low sample complexity. We validate our theoretical findings through experiments on both real and synthetic data. As a result, we give a rigorous, practical, and empirically effective method to compute bipartite matchings.
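To make the first of these challenges concrete, below is a minimal Python sketch of one natural feasibility repair for maximum-weight bipartite matching, where a dual solution is feasible when y_left[u] + y_right[v] >= w for every edge (u, v, w). The function name repair_duals and the one-sided repair rule are illustrative assumptions for this sketch, not necessarily the exact procedure from the talk.

    # A minimal sketch (assumed helper, not the talk's exact algorithm):
    # repair predicted duals for maximum-weight bipartite matching so that
    # y_left[u] + y_right[v] >= w holds on every edge, raising right-side
    # duals only.
    def repair_duals(edges, y_left, y_right):
        """Return feasible duals close to the predictions.

        edges  : list of (u, v, w) triples for left vertex u, right vertex v
        y_left : dict of predicted duals for left vertices (kept fixed)
        y_right: dict of predicted duals for right vertices (raised as needed)
        """
        y_right = dict(y_right)  # do not mutate the caller's predictions
        for u, v, w in edges:
            slack = w - y_left[u] - y_right[v]
            if slack > 0:
                # Raise y_right[v] just enough to satisfy this constraint.
                # Duals only increase, so constraints checked earlier stay
                # satisfied and a single pass over the edges suffices.
                y_right[v] += slack
        return y_left, y_right

    # Tiny usage example on a 2x2 instance with infeasible predictions.
    if __name__ == "__main__":
        edges = [("u1", "v1", 5), ("u1", "v2", 3),
                 ("u2", "v1", 2), ("u2", "v2", 4)]
        y_left, y_right = repair_duals(edges, {"u1": 2, "u2": 1},
                                       {"v1": 1, "v2": 0})
        assert all(y_left[u] + y_right[v] >= w for u, v, w in edges)
        print(y_right)  # {'v1': 3, 'v2': 3}

A repair of this kind runs in linear time in the number of edges, and the resulting feasible duals can then warm-start a standard primal-dual (Hungarian-style) search, which finishes quickly when the predictions are close to the optimal duals.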

Appeared as an oral presentation at NeurIPS 2021. Joint work with Sungjin Im, Thomas Lavastida, Benjamin Moseley, and Sergei Vassilvitskii.

Organizers

Greg Bodwin
Euiwoong Lee