On Representing LPs and MILPs by Graph Neural Networks
Published: 2022-11-10
Published By: Xiaoni Tan
Speaker(s): Dr. Wotao Yin (DAMO Academy, Alibaba)
Time: 10:10-11:10 November 14, 2022
Venue: Online
https://meeting.tencent.com/dm/wOUCg38TrM7l
Tencent Meeting ID: 622-913-335
Passcode: 123456
Abstract: While mixed-integer linear programming (MILP) is NP-hard in general, practical MILP solving has seen a 100-fold speedup over the past twenty years. Still, many classes of MILPs quickly become unsolvable as their sizes increase, motivating researchers to seek new techniques to accelerate MILP solution. With deep learning, researchers have obtained strong empirical results, many of which were produced by applying graph neural networks (GNNs) to make decisions at various stages of the MILP solution process.
This work discovers a fundamental limitation: there exist feasible and infeasible MILPs that no GNN can distinguish, indicating that GNNs lack the power to represent general MILPs. We then show positive results:
1. GNNs do not have this limitation for linear programs (LPs);
2. by restricting to unfoldable MILPs or, alternatively, by adding random features, there exist GNNs that can reliably predict MILP feasibility, optimal objective values, and optimal solutions up to a prescribed precision.
We conducted small-scale numerical experiments to validate our theoretical findings.
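For readers new to this line of work, the following is a minimal sketch of the variable-constraint bipartite encoding of an MILP that GNN-based approaches typically consume, together with the random-feature augmentation mentioned in item 2 above. It is an illustration in plain NumPy, not the speakers' implementation; the function names (milp_to_bipartite, message_pass) and the toy mean-aggregation step are assumptions made for this sketch.

```python
import numpy as np

def milp_to_bipartite(A, b, c, integrality, rng=None, rand_dim=1):
    """Encode an MILP  min c^T x  s.t.  A x <= b,  x_j integer for j in `integrality`
    as a bipartite graph with one node per constraint and one node per variable.

    Returns constraint-node features, variable-node features, and the weighted
    biadjacency matrix (the coefficients of A serve as edge weights).
    Random features are appended when `rng` is given (item 2 of the abstract).
    """
    m, n = A.shape
    # Constraint-node features: right-hand side b_i.
    con_feat = b.reshape(m, 1).astype(float)
    # Variable-node features: objective coefficient c_j and an integrality flag.
    flags = np.array([1.0 if j in integrality else 0.0 for j in range(n)])
    var_feat = np.stack([c.astype(float), flags], axis=1)
    if rng is not None:  # optional random features to break graph symmetries
        con_feat = np.hstack([con_feat, rng.standard_normal((m, rand_dim))])
        var_feat = np.hstack([var_feat, rng.standard_normal((n, rand_dim))])
    return con_feat, var_feat, A.astype(float)


def message_pass(con_feat, var_feat, A):
    """One toy round of normalized message passing across the bipartite graph."""
    deg_con = np.maximum(np.abs(A).sum(axis=1, keepdims=True), 1.0)
    deg_var = np.maximum(np.abs(A).sum(axis=0, keepdims=True).T, 1.0)
    new_con = np.hstack([con_feat, (A @ var_feat) / deg_con])
    new_var = np.hstack([var_feat, (A.T @ con_feat) / deg_var])
    return new_con, new_var


if __name__ == "__main__":
    # Tiny MILP:  min -x0 - x1  s.t.  x0 + x1 <= 1,  x0 and x1 integer.
    A = np.array([[1.0, 1.0]])
    b = np.array([1.0])
    c = np.array([-1.0, -1.0])
    con, var, W = milp_to_bipartite(A, b, c, integrality={0, 1},
                                    rng=np.random.default_rng(0))
    con, var = message_pass(con, var, W)
    print(con.shape, var.shape)
```

In an actual learning pipeline the message-passing step would be a trained GNN layer rather than this fixed aggregation; the point of the random features, per the abstract, is to break the symmetries that make certain feasible and infeasible MILPs indistinguishable to standard GNNs.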
Co-authors: Ziang Chen (Duke Math), Jialin Liu (Alibaba DAMO), Xinshang Wang (Alibaba DAMO), Jianfeng Lu (Duke Math)