My name is Mark Kurtz. I was the CTO at a startup called Neural Magic. We were acquired by Red Hat at the end of last year, and I'm now working under the CTO arm at Red Hat. I'm going to be talking about GenAI at scale: essentially, what it enables, a quick overview of that, the costs, and generally how to reduce the pain. Running through the structure in a bit more detail, we'll go through the state of LLMs and real-world deployment trends.