Abstract
Combinatorial optimization has wide applications throughout many industries. However, in many real-life applications, some or all coefficients of the optimization problem are not known at the time of execution. In such applications, those coefficients are estimated using machine learning (ML) models. End-to-end predict-and-optimize approaches, which train the ML model taking the optimization task into consideration, have received increasing attention. In the case of a mixed integer linear program (MILP), previous work suggested relaxing the MILP, adding a quadratic regularizer term, and differentiating the KKT conditions to facilitate gradient-based learning. In this work, we propose to differentiate the homogeneous self-dual formulation of the relaxed LP, which contains more parameters, instead of the KKT conditions. Moreover, as our formulation contains a log-barrier term, we need not add a quadratic term to make the formulation differentiable.
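The abstract's key observation, that a log-barrier term already yields a differentiable solution map so no quadratic regularizer is needed, can be illustrated on a toy problem. The sketch below is not the paper's homogeneous self-dual method: it is a minimal box-constrained LP min c@x over [0,1]^n, smoothed with a log barrier, solved coordinate-wise by bisection and differentiated with respect to the cost vector c via the implicit function theorem. All function names are hypothetical, for illustration only.

```python
import numpy as np

def barrier_solve(c, mu=0.1, tol=1e-10):
    """Solve min_x c@x - mu*(log x + log(1-x)) over (0,1)^n.

    The problem separates per coordinate; the stationarity condition
        c_i - mu/x_i + mu/(1 - x_i) = 0
    has a strictly increasing left-hand side in x_i, so bisection finds
    the unique root in (0, 1).
    """
    x = np.empty_like(c, dtype=float)
    for i, ci in enumerate(c):
        lo, hi = 1e-12, 1.0 - 1e-12
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if ci - mu / mid + mu / (1.0 - mid) < 0:
                lo = mid
            else:
                hi = mid
        x[i] = 0.5 * (lo + hi)
    return x

def barrier_grad(x, mu=0.1):
    """dx*/dc via the implicit function theorem: differentiating the
    stationarity condition c - mu/x + mu/(1-x) = 0 in c gives
        dx/dc = -1 / (mu/x^2 + mu/(1-x)^2)   (diagonal, per coordinate),
    which is finite everywhere because the barrier keeps x strictly
    inside (0, 1)."""
    return -1.0 / (mu / x**2 + mu / (1.0 - x) ** 2)

c = np.array([1.0, -2.0, 0.5])
x = barrier_solve(c)
dx_dc = barrier_grad(x)

# Sanity check of the analytic derivative against central differences
eps = 1e-6
e0 = np.array([eps, 0.0, 0.0])
fd = (barrier_solve(c + e0) - barrier_solve(c - e0)) / (2 * eps)
print(dx_dc[0], fd[0])
```

The derivative is negative, as expected: increasing a cost coefficient pushes the smoothed minimizer down. The same implicit-differentiation idea is what makes an interior-point solve usable as a layer in gradient-based training.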
Original language | English
---|---
Number of pages | 8
Publication status | Published - 7 Sep 2020
Event | Doctoral Program of the 26th International Conference on Principles and Practice of Constraint Programming - Online, 7 Sep 2020 → 11 Sep 2020, https://cp2020.a4cp.org/callfordp.html
Conference
Conference | Doctoral Program of the 26th International Conference on Principles and Practice of Constraint Programming
---|---
Period | 7/09/20 → 11/09/20
Internet address | https://cp2020.a4cp.org/callfordp.html
Keywords
- data-driven optimization
- interior point method
- neural networks
Activities
- 1 Talk or presentation at a conference

Differentiable Optimization Layer with an Interior Point Approach
Jayanta Mandi (Speaker), Tias Guns (Speaker), 7 Sep 2020
Activity: Talk or presentation › Talk or presentation at a conference