Distillation Can Make AI Models Smaller and Cheaper


A fundamental technique lets researchers use a big, expensive model to train another model for less.
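
In standard knowledge distillation, the large "teacher" model's output probabilities serve as soft training targets for a smaller "student" model. The sketch below is an illustrative, generic distillation loss in PyTorch, not code from the article; the temperature and weighting values are placeholder assumptions.

```python
# Illustrative knowledge-distillation loss (generic recipe, not from the article).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Placeholder hyperparameters: temperature softens both distributions,
    # alpha blends the soft (teacher-matching) and hard (label) losses.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence pushes the student toward the teacher's softened
    # output distribution; the T^2 factor keeps gradients on a comparable scale.
    soft_loss = F.kl_div(log_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

During training, the teacher runs in inference mode on each batch to produce teacher_logits and only the student's parameters are updated, which is how a cheaper model can inherit much of an expensive model's behavior.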

Read this article on Wired.com

Related news:

Antarctica’s frozen heart is warming fast, and models missed it

Suite of models shows some positive effects of climate-smart Ag practices - EurekAlert!

Subliminal Learning Lets Student AI Models Learn Unexpected (and Sometimes Misaligned) Traits from Their Teachers