NICE: NON-LINEAR INDEPENDENT COMPONENTS ESTIMATION


10 Apr 2015 | Laurent Dinh, David Krueger, Yoshua Bengio
The paper introduces Non-linear Independent Components Estimation (NICE), a deep learning framework for modeling complex high-dimensional densities. NICE learns a transformation that maps the data to a latent space in which the distribution factorizes into independent components, making it easy to model. The transformation is designed so that its Jacobian determinant and its inverse are trivial to compute, which makes the exact log-likelihood tractable and permits unbiased ancestral sampling. The training criterion is simply maximization of this exact log-likelihood. The architecture is described in detail: it is built from coupling layers, which give bijective transformations with tractable Jacobians. Experiments on several image datasets show that NICE yields good generative models in terms of log-likelihood and generation quality, and that it can be used for inpainting.
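To make the coupling-layer idea concrete, here is a minimal sketch (in NumPy, not the authors' code) of an additive coupling layer: one half of the input passes through unchanged, while the other half is shifted by a function m of the first half. The coupling function m below is an illustrative toy MLP, and the names (coupling_forward, coupling_inverse) are ours; the key properties are that inverting the layer never requires inverting m, and that the Jacobian is unit triangular, so its log-determinant is zero.

```python
# A minimal sketch of an additive coupling layer, assuming a toy
# one-hidden-layer MLP as the coupling function m (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def m(x_half, W1, b1, W2, b2):
    """Toy coupling function: a small ReLU MLP applied to one half of the input."""
    h = np.maximum(0.0, x_half @ W1 + b1)
    return h @ W2 + b2

def coupling_forward(x, params):
    """y1 = x1, y2 = x2 + m(x1). The Jacobian is unit lower-triangular,
    so its log-determinant is 0."""
    d = x.shape[1] // 2
    x1, x2 = x[:, :d], x[:, d:]
    return np.concatenate([x1, x2 + m(x1, *params)], axis=1)

def coupling_inverse(y, params):
    """x1 = y1, x2 = y2 - m(y1). Inversion never requires inverting m."""
    d = y.shape[1] // 2
    y1, y2 = y[:, :d], y[:, d:]
    return np.concatenate([y1, y2 - m(y1, *params)], axis=1)

# Demonstration on random data: forward followed by inverse recovers the input.
D, H = 4, 8
params = (rng.normal(size=(D // 2, H)), np.zeros(H),
          rng.normal(size=(H, D // 2)), np.zeros(D // 2))
x = rng.normal(size=(3, D))
y = coupling_forward(x, params)
assert np.allclose(coupling_inverse(y, params), x)

# With a standard Gaussian prior and a final diagonal scaling (as in the
# paper), the exact log-likelihood is log p(z) + sum(log|s|), since the
# additive coupling layers contribute zero log-determinant.
log_s = rng.normal(size=D) * 0.1
z = y * np.exp(log_s)
log_likelihood = (-0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)).sum(axis=1) + log_s.sum()
print(log_likelihood)
```

In practice several such layers are stacked, alternating which half of the dimensions is left unchanged, so that every dimension is eventually transformed.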