How Distillation Makes AI Models Smaller and Cheaper

A fundamental technique lets researchers use a big, expensive “teacher” model to train a “student” model for less.
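
To make the idea concrete: in the classic recipe from Hinton, Vinyals, and Dean's 2015 distillation paper, the student is trained to match the teacher's softened output probabilities in addition to the true labels. Below is a minimal PyTorch sketch of that loss; the temperature, mixing weight, and toy tensors are illustrative assumptions, not details from this article.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-label distillation loss (a sketch; T and alpha are assumed values)."""
    # Soften both output distributions with temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence pulls the student's distribution toward the teacher's;
    # the T**2 factor keeps gradient scale comparable across temperatures.
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T ** 2)
    # Ordinary cross-entropy against the true hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage with random logits over 10 classes (hypothetical shapes).
student = torch.randn(8, 10, requires_grad=True)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
loss.backward()  # gradients flow only into the student
```

The key design point is that the teacher's full probability distribution carries more signal than the hard label alone (which wrong answers are almost right, and by how much), so a much smaller student can absorb behavior that would otherwise require the teacher's scale.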