Recent advances in deep generative models involve learning complex nonlinear transformations from a simple distribution of independent factors to a more complex distribution matching the data distribution. However, these settings typically necessitate approximate inference schemes. We propose a framework, Nonlinear Independent Components Estimation (NICE), for learning such complex nonlinear transformations with exact log-likelihood evaluation, via the change of variables formula. Using a composition of simple nonlinear building blocks, we make exact inference and sampling as simple as a feed-forward pass. The resulting approach yields good performance on three datasets and can be used for inpainting.

Laurent Dinh is a PhD candidate at the University of Montréal. He received his MSc in applied mathematics and machine learning from École Centrale de Paris and École Normale Supérieure, France. His recent work has focused on approximate and tractable inference for deep generative models.
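
For a concrete sense of what "simple nonlinear building blocks" and "exact log-likelihood via the change of variables formula" mean, the following is a minimal NumPy sketch, not the speaker's code: additive coupling layers whose Jacobian determinant equals one, so that log p_X(x) = log p_H(f(x)) + log |det df(x)/dx| reduces to the prior log-density of f(x). The layer sizes, the Gaussian prior, and all function names below are illustrative assumptions; a full model along the lines of the NICE paper also adds a learned scaling layer and is trained by maximum likelihood, both omitted here.

import numpy as np

rng = np.random.default_rng(0)


def mlp(x, W1, b1, W2, b2):
    # Small coupling function m(.). It can be arbitrarily complex because
    # inverting the layer never requires inverting m itself.
    return np.tanh(x @ W1 + b1) @ W2 + b2


class AdditiveCoupling:
    # One building block: identity on one half of the input, a data-dependent
    # shift on the other half. Its Jacobian determinant is exactly 1.

    def __init__(self, d_half, d_hidden=16, swap=False):
        self.swap = swap  # alternate which half is left unchanged
        self.W1 = rng.normal(0.0, 0.1, (d_half, d_hidden))
        self.b1 = np.zeros(d_hidden)
        self.W2 = rng.normal(0.0, 0.1, (d_hidden, d_half))
        self.b2 = np.zeros(d_half)

    def _m(self, a):
        return mlp(a, self.W1, self.b1, self.W2, self.b2)

    def forward(self, x):
        a, b = np.split(x, 2, axis=-1)
        if self.swap:
            return np.concatenate([a + self._m(b), b], axis=-1)
        return np.concatenate([a, b + self._m(a)], axis=-1)

    def inverse(self, y):
        a, b = np.split(y, 2, axis=-1)
        if self.swap:
            return np.concatenate([a - self._m(b), b], axis=-1)
        return np.concatenate([a, b - self._m(a)], axis=-1)


# Compose a few coupling layers, alternating which half stays fixed.
layers = [AdditiveCoupling(d_half=2, swap=(i % 2 == 1)) for i in range(4)]


def f(x):
    # Inference (data -> independent factors): a plain feed-forward pass.
    for layer in layers:
        x = layer.forward(x)
    return x


def f_inv(h):
    # Sampling (independent factors -> data): also a feed-forward pass.
    for layer in reversed(layers):
        h = layer.inverse(h)
    return h


def log_likelihood(x):
    # Exact log-likelihood: log-density of f(x) under a factorized standard
    # Gaussian prior; the log-determinant term vanishes because each
    # coupling layer has determinant 1.
    h = f(x)
    return -0.5 * np.sum(h ** 2, axis=-1) - 0.5 * h.shape[-1] * np.log(2.0 * np.pi)


x = rng.normal(size=(5, 4))
print(np.allclose(f_inv(f(x)), x))  # exact invertibility
print(log_likelihood(x))            # exact log-likelihood, no approximation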