OptNet: Differentiable Optimization as a Layer in Neural Networks

2 Dec 2021 | Brandon Amos, J. Zico Kolter
This paper introduces OptNet, a neural network architecture that integrates optimization problems, specifically quadratic programs (QPs), as individual layers within larger end-to-end trainable deep networks. These layers capture complex constraints and dependencies between hidden states, enhancing the network's ability to solve intricate tasks. The authors derive techniques for differentiating through these layers, develop an efficient GPU-based primal-dual interior point method for solving QPs, and demonstrate the effectiveness of OptNet in various applications, including signal denoising and learning the game of mini-Sudoku. The method's ability to learn hard constraints without prior knowledge of the problem rules is highlighted, showcasing its potential for solving complex inference problems in neural networks.
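To illustrate the core idea of differentiating through a QP layer, the sketch below solves an equality-constrained QP via its KKT linear system and backpropagates through the solution by implicit differentiation of the KKT conditions. This is a simplified illustration, not the paper's method: OptNet also handles inequality constraints and uses a batched GPU interior-point solver, whereas this toy restricts to equality constraints so the KKT system is a single linear solve. The function names (`qp_layer_forward`, `qp_layer_backward`) and the tiny problem instance are invented for the example.

```python
import numpy as np

def qp_layer_forward(Q, p, A, b):
    """Solve min_z 0.5 z^T Q z + p^T z  s.t.  A z = b.

    With only equality constraints, the KKT conditions are the linear system
        [Q  A^T] [z ]   [-p]
        [A   0 ] [nu] = [ b]
    """
    n, m = Q.shape[0], A.shape[0]
    K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-p, b]))
    return sol[:n], K  # primal solution z*, and K for the backward pass

def qp_layer_backward(K, n, grad_z):
    """Gradient of a scalar loss w.r.t. p, given dloss/dz at the solution.

    Differentiating the KKT system w.r.t. p gives K [dz; dnu] = [-dp; 0],
    so the backward pass is one more solve with the (symmetric) matrix K.
    """
    rhs = np.concatenate([grad_z, np.zeros(K.shape[0] - n)])
    u = np.linalg.solve(K, rhs)
    return -u[:n]  # dloss/dp

# Tiny instance: min z1^2 + z2^2 + z1 - z2  s.t.  z1 + z2 = 1
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
p = np.array([1.0, -1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

z, K = qp_layer_forward(Q, p, A, b)          # z* = [0, 1]
grad_p = qp_layer_backward(K, 2, 2 * z)      # loss = ||z||^2, so dloss/dz = 2z
```

The key point, which carries over to the full inequality-constrained setting in the paper, is that the backward pass costs only one extra linear solve against the same KKT matrix already factored in the forward pass, rather than unrolling the optimizer's iterations.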