Large Neighborhood Search for ILPs with Relaxation-based and Contrastive-Learning-based Heuristics

Presenter
April 28, 2023
Abstract
Large Neighborhood Search (LNS) is an effective heuristic algorithm for finding high-quality solutions to combinatorial optimization problems, including large-scale Integer Linear Programs (ILPs). It starts with an initial solution and iteratively improves it by searching a large neighborhood around the current best solution. LNS relies on destroy heuristics to select the neighborhood to search: a subset of the variables is un-assigned and their assignment is then re-optimized. We focus on designing effective and efficient destroy heuristics in LNS for ILPs. Local Branching (LB) is a heuristic that, in each iteration of LNS, selects the neighborhood that leads to the largest improvement over the current solution, i.e., the optimal subset of variables to 'destroy' within a Hamming ball around the incumbent solution. However, LB is often slow since it must solve an ILP of the same size as the input. First, we propose LB-relaxation-based heuristics that compute neighborhoods as effective as LB's but run faster, and achieve state-of-the-art anytime performance on several ILP benchmarks. Next, we propose a novel ML-based LNS for ILPs, namely CL, which uses contrastive learning to learn to imitate the LB heuristic. We not only use the optimal subsets provided by LB as expert demonstrations, but also leverage intermediate solutions and perturbations to collect sets of positive and negative samples. We use graph attention networks and a richer set of features to further improve performance. Empirically, CL outperforms state-of-the-art ML and non-ML approaches at different runtime cutoffs on multiple metrics, including the primal gap, the primal integral, the best-performing rate, and the survival rate, and it achieves good generalization performance.
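To make the LNS loop described above concrete, here is a minimal, self-contained sketch. It is not the talk's implementation: it uses a toy 0-1 knapsack instance (so it runs without an ILP solver), a random destroy heuristic as a stand-in for LB, its LB-relaxation variants, or the learned CL policy, and brute-force re-optimization of the destroyed variables in place of re-solving a sub-ILP. All instance data and function names are illustrative assumptions.

```python
# Hedged sketch of a generic LNS loop: destroy a subset of variables, then
# re-optimize them while keeping the remaining assignment fixed.
import random
from itertools import product

# Toy 0-1 knapsack instance (illustrative data, not from the talk).
values = [10, 7, 4, 9, 6, 3, 8, 5]
weights = [5, 4, 2, 6, 3, 2, 5, 3]
capacity = 15
n = len(values)

def objective(x):
    """Total value if the solution is feasible, else -inf."""
    if sum(w for w, xi in zip(weights, x) if xi) > capacity:
        return float("-inf")
    return sum(v for v, xi in zip(values, x) if xi)

def choose_destroy_set(x, k):
    """Destroy heuristic: pick k variables to un-assign.
    Random here; LB / LB-relaxation / CL would make this choice intelligently."""
    return random.sample(range(n), k)

def reoptimize(x, destroyed):
    """Re-optimize the destroyed variables (brute force), keeping the rest fixed."""
    best_x, best_val = list(x), objective(x)
    for assignment in product([0, 1], repeat=len(destroyed)):
        cand = list(x)
        for i, bit in zip(destroyed, assignment):
            cand[i] = bit
        val = objective(cand)
        if val > best_val:
            best_x, best_val = cand, val
    return best_x

def lns(iterations=50, k=3, seed=0):
    random.seed(seed)
    x = [0] * n  # trivial initial feasible solution
    for _ in range(iterations):
        destroyed = choose_destroy_set(x, k)
        x = reoptimize(x, destroyed)  # search the neighborhood around x
    return x, objective(x)

if __name__ == "__main__":
    solution, value = lns()
    print("best solution:", solution, "value:", value)
```

In the setting of the talk, `reoptimize` would call an ILP solver on the sub-problem, and `choose_destroy_set` is exactly the component LB optimizes over a Hamming ball around the incumbent and that the CL policy learns to imitate.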