DAGs with NO TEARS: Continuous Optimization for Structure Learning


November 6, 2018 | Xun Zheng, Bryon Aragam, Pradeep Ravikumar, and Eric P. Xing
This paper introduces NOTEARS, a method for learning the structure of directed acyclic graphs (DAGs) that recasts structure learning as a continuous optimization problem. Traditional score-based methods search over the discrete, combinatorial space of DAGs, which is NP-hard in general. The authors instead replace the combinatorial acyclicity constraint with a single smooth equality constraint based on the matrix exponential: h(W) = tr(exp(W ∘ W)) − d = 0, where W is the weighted adjacency matrix, ∘ is the elementwise (Hadamard) product, and d is the number of nodes. Because h is smooth with a closed-form gradient, the problem can be solved with standard numerical optimization techniques.

NOTEARS handles the resulting equality-constrained program with the augmented Lagrangian method, using L-BFGS for the inner unconstrained subproblems, and applies ℓ₁-regularization to promote sparsity in the estimated graph. In experiments, the approach is compared with existing methods such as GES, PC, and LiNGAM, and it outperforms them in accuracy while remaining efficient. It also performs well on real-world data, even with small sample sizes.

The key contribution is the smooth characterization of acyclicity, which turns DAG learning into a continuous program and enables efficient, effective structure learning without prior structural assumptions about the graph. The method is implemented in Python and is publicly available.
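To make the acyclicity constraint concrete, here is a minimal Python sketch of the function h described above, using the matrix exponential from SciPy. The helper name `acyclicity` and the toy matrices are ours for illustration, not part of the paper's released code; the formula itself is the one NOTEARS uses.

```python
# Minimal sketch of the NOTEARS acyclicity function for a graph on d nodes.
# h(W) = tr(exp(W ∘ W)) - d equals zero if and only if W encodes a DAG:
# the trace of the matrix exponential accumulates weighted closed walks
# of every length, so any cycle makes it strictly positive.
import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    """Smooth measure of cyclicity: 0 iff the weighted graph W is a DAG."""
    d = W.shape[0]
    E = expm(W * W)          # matrix exponential of the Hadamard square
    return np.trace(E) - d   # subtract d to cancel the identity's trace

# Example: a 3-node chain (acyclic) vs. a 2-cycle.
W_dag = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0]])
W_cyc = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
print(acyclicity(W_dag))  # ~0.0 (no cycles)
print(acyclicity(W_cyc))  # > 0  (the 2-cycle is detected)
```

In the full method, this scalar constraint is driven to zero by the augmented Lagrangian loop while the score (e.g., a least-squares loss plus an ℓ₁ penalty on W) is minimized with L-BFGS.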