Podcast: Arxiv Papers
Episode: Why Knowledge Distillation Works in Generative Models: A Minimal Working Explanation