Hacker News: Distilling knowledge from neural networks to build smaller and faster models
https://blog.floydhub.com/knowledge-distillation/
16 comments · 15 Nov 2019
Reddit (r/MachineLearning): [R] Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation
https://arxiv.org/abs/2106.07849
3 comments · 27 Jul 2021