March 22, 2026 · 5 min read

    Probabilistic Generative Models Overview

    This post is the first, introductory entry in the series From Probabilistic Modeling to Generative Modeling. It is my attempt to consolidate the knowledge I gained from studying probabilistic generative models. In this entry, I break down the general concepts needed to understand five probabilistic generative models: Gaussian Mixture Models (GMM), Variational Autoencoders (VAE), Normalizing Flows (NF), Generative Adversarial Networks (GAN), and Diffusion Models (DM), each of which will be covered in a later post. The models are presented in an order I find intuitive, rather than following a strict categorization. Before starting, it is worth understanding why generative modeling is useful.

    1 – Why Probabilistic Generative Modeling?

    To answer this question, we need to understand the difference between discriminative and generative models.
    Figure: a classifier's predictions on an original image (P(y=cat|x) = 0.95, P(y=dog|x) = 0.05) and on a noisy version of the same image (P(y=cat|x) = 0.1, P(y=dog|x) = 0.9).
    The classifier predicts the probability of each label given an image; that is, it learns the conditional probability P(y|x).
    Classification algorithms summarize complex inputs into simple outputs, discarding most of the original information. Generative models take the opposite route: they model the distribution of the inputs themselves, so they can generate new samples rather than only label existing ones.
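The contrast can be made concrete with a toy joint distribution. In this sketch (all numbers and names are illustrative, not from the post), a discriminative model keeps only P(y|x), while a generative model keeps the full joint P(x, y) and can therefore sample new (x, y) pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative joint distribution P(x, y):
# rows index 3 input "types" x, columns index labels y in {cat, dog}.
joint = np.array([[0.30, 0.05],
                  [0.10, 0.25],
                  [0.20, 0.10]])

# A discriminative model only keeps P(y | x): enough to classify inputs.
p_y_given_x = joint / joint.sum(axis=1, keepdims=True)

# A generative model keeps the full joint, so it can also *sample* new pairs.
flat = joint.ravel()
idx = rng.choice(flat.size, p=flat)
x_new, y_new = divmod(idx, joint.shape[1])

print(p_y_given_x[0])    # label probabilities for input x = 0
print(x_new, y_new)      # a freshly sampled (input, label) pair
```

Discarding the joint is exactly the information loss described above: from P(y|x) alone there is no way to recover what the inputs look like.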

    2 – Why Probability?

    Probability is the mathematical framework used to quantify uncertainty and reason about data.

    2.1 – Bayesian vs Frequentist

    The frequentist approach interprets probability as the long-run frequency of an event over repeated trials, while the Bayesian approach interprets it as a degree of belief that is updated as evidence arrives.
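A coin-flip estimate makes the difference tangible. This is a minimal sketch, and the Beta(2, 2) prior is an illustrative choice of mine, not something from the post:

```python
# Estimating a coin's heads probability from 10 flips with 7 heads.
heads, flips = 7, 10

# Frequentist: the observed long-run frequency, i.e. the maximum-likelihood
# estimate of the heads probability.
theta_mle = heads / flips

# Bayesian: start from a Beta(2, 2) prior belief and update it with the data.
# The posterior is Beta(2 + heads, 2 + tails); its mean is computed below.
a, b = 2 + heads, 2 + (flips - heads)
theta_posterior_mean = a / (a + b)

print(theta_mle)             # 0.7
print(theta_posterior_mean)  # 9/14, about 0.643
```

The Bayesian estimate is pulled toward the prior's mean of 0.5; with more flips, the two estimates converge.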

    2.2 – Key Rules

    Sum Rule

    \[ P(X) = P(X, Y) + P(X, \bar{Y}) \]

    Marginalization

    \[ P(X) = \sum_{y} P(X, Y = y) \]

    Conditional Probability

    \[ P(Y|X) = \frac{P(X,Y)}{P(X)} \]

    Bayes Theorem

    \[ P(Y|X) = \frac{P(X|Y)P(Y)}{P(X)} \]
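All four rules can be checked numerically on a small joint distribution. A minimal sketch, with an illustrative 2×2 joint:

```python
import numpy as np

# Illustrative joint distribution P(X, Y):
# rows index X in {0, 1}, columns index Y in {0, 1}.
P = np.array([[0.1, 0.3],
              [0.2, 0.4]])

# Sum rule / marginalization: P(X) = sum_y P(X, Y = y).
P_X = P.sum(axis=1)
P_Y = P.sum(axis=0)

# Conditional probability: P(Y | X) = P(X, Y) / P(X).
P_Y_given_X = P / P_X[:, None]

# Bayes' theorem: P(Y | X) = P(X | Y) P(Y) / P(X).
P_X_given_Y = P / P_Y[None, :]
bayes = P_X_given_Y * P_Y[None, :] / P_X[:, None]

print(np.allclose(P_Y_given_X, bayes))  # both routes give the same answer
```

The last line confirms that Bayes' theorem is just the conditional-probability definition applied twice to the same joint.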

    3 – Probabilistic Modeling

    Maximum Likelihood Estimation

    \[ \hat{\theta} = \arg \max_{\theta} \log p(D | \theta) \]
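For a Gaussian model, the maximization above has a closed form, which a short sketch can verify on synthetic data (the true parameters 3.0 and 1.5 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic dataset D: draws from a Gaussian with "unknown" parameters.
D = rng.normal(loc=3.0, scale=1.5, size=10_000)

# For a Gaussian, argmax_theta log p(D | theta) is attained at the sample
# mean and the (biased) sample standard deviation.
mu_hat = D.mean()
sigma_hat = D.std()

print(mu_hat, sigma_hat)  # close to the true 3.0 and 1.5
```

With 10,000 samples, both estimates land within a few hundredths of the values used to generate the data.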

    5 – Latent Generative Models

    Latent Variables

    Latent variables represent hidden structures in data.
    Figure: latent variables diagram. Latent variable models capture hidden patterns.
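The standard way to generate from a latent variable model is ancestral sampling: first draw the hidden variable z, then draw the observation x conditioned on z. A minimal sketch using a two-component Gaussian mixture, with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters of a 2-component mixture of Gaussians.
weights = np.array([0.4, 0.6])   # P(z): which hidden cluster
means = np.array([-2.0, 3.0])    # p(x | z): cluster-specific means
stds = np.array([0.5, 1.0])      # p(x | z): cluster-specific stds

# Step 1: sample the latent variable z for each data point.
z = rng.choice(2, size=5_000, p=weights)

# Step 2: sample each observation x from its cluster's Gaussian.
x = rng.normal(means[z], stds[z])

print(x[:3])  # a few generated samples
```

The resulting samples form two clusters even though z is never observed; that hidden cluster assignment is exactly the latent structure the diagram refers to.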

    6 – Hands-On: Generating Images

    Images can be treated as samples drawn from a probability distribution.
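This idea can be sketched at toy scale: fit a distribution to a dataset of tiny binary "images", then sample a new image from it. The per-pixel Bernoulli model and the synthetic data below are illustrative simplifications, not the post's method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic dataset of 4x4 binary images, drawn from a hidden per-pixel
# distribution true_p (each entry is P(pixel = 1)).
true_p = np.clip(rng.random((4, 4)), 0.1, 0.9)
data = (rng.random((1_000, 4, 4)) < true_p).astype(float)

# Fit an independent Bernoulli per pixel by maximum likelihood:
# the estimate is simply each pixel's empirical frequency of being on.
p_hat = data.mean(axis=0)

# Generate a brand-new image by sampling from the learned distribution.
new_image = (rng.random((4, 4)) < p_hat).astype(int)
print(new_image)
```

Real images need far richer models, since neighboring pixels are strongly dependent; capturing those dependencies is precisely what the five models in this series are for.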

    "The key is not to prioritize what's on your schedule, but to schedule your priorities."

    — Stephen Covey

    Key Takeaways

  1. 80% of your exam marks come from roughly 20% of the syllabus.
  2. Identify high-yield topics using past papers, syllabus weighting, and senior advice.
  3. Replace passive study habits with active recall and spaced repetition.
  4. The time you save should fuel innovation — projects, content, and side businesses.
  5. Use the Focus → Filter → Execute framework to structure every study session.
  6. Related Articles

    Comments

    Leave a Reply

    Scroll to Top