Normalizing Flows are a class of generative models that transform a simple probability distribution, such as a standard Gaussian, into a more complex distribution through a series of invertible mappings. The key idea is to use a sequence of bijective transformations to map a simple latent variable $z \sim p_Z(z)$ into a target variable $x$ as follows:

$$x = f_K \circ f_{K-1} \circ \cdots \circ f_1(z)$$
This approach allows the computation of the probability density function of the target variable using the change of variables formula:

$$p_X(x) = p_Z(z) \prod_{k=1}^{K} \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|^{-1}, \quad z_0 = z, \quad z_k = f_k(z_{k-1})$$
where $p_Z(z)$ is the density of the latent variable and the determinant term accounts for the change in volume induced by the transformations. Normalizing Flows are particularly powerful because they can model complex distributions while allowing both efficient sampling and exact likelihood computation, making them suitable for applications such as density estimation and variational inference.
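The change of variables computation above can be sketched with a deliberately simple flow. This is a minimal illustration, not a practical model: each step is a hypothetical scalar affine map $f_k(z) = a_k z + b_k$, whose Jacobian determinant is just $a_k$, so the log-density of $x$ is the latent Gaussian log-density at the inverted point minus the accumulated log-determinants.

```python
import numpy as np

def gaussian_logpdf(z):
    # Log-density of the standard Gaussian latent variable p_Z.
    return -0.5 * (z**2 + np.log(2 * np.pi))

class AffineFlow:
    """One invertible step x = a*z + b (requires a != 0)."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def forward(self, z):
        return self.a * z + self.b

    def inverse(self, x):
        return (x - self.b) / self.a

    def log_abs_det_jacobian(self):
        # dx/dz = a, so log|det J| = log|a| in the scalar case.
        return np.log(abs(self.a))

def flow_log_density(x, flows):
    # Change of variables: invert the chain to recover z,
    # then log p_X(x) = log p_Z(z) - sum_k log|det J_k|.
    z, log_det = x, 0.0
    for f in reversed(flows):
        z = f.inverse(z)
        log_det += f.log_abs_det_jacobian()
    return gaussian_logpdf(z) - log_det

# Two hypothetical flow steps; sampling pushes latent draws forward.
flows = [AffineFlow(2.0, 1.0), AffineFlow(0.5, -0.3)]
z = np.random.default_rng(0).standard_normal(5)
x = z
for f in flows:
    x = f.forward(x)
```

In real flows (e.g. RealNVP or planar flows) the steps are parameterized neural networks, but the bookkeeping is identical: invert each step and accumulate log-determinants.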