Hacker News
- Efficient Transformer Knowledge Distillation: A Performance Review https://arxiv.org/abs/2311.13657 5 comments
- CoRF: Colorizing Radiance Fields Using Knowledge Distillation https://arxiv.org/abs/2309.07668 4 comments
- Distilling knowledge from neural networks to build smaller and faster models https://blog.floydhub.com/knowledge-distillation/ 16 comments
- [R] Does Knowledge Distillation Really Work? https://arxiv.org/abs/2106.05945 7 comments machinelearning
- [R] Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation https://arxiv.org/abs/2106.07849 3 comments machinelearning
- [R] Knowledge distillation: A good teacher is patient and consistent https://arxiv.org/abs/2106.05237 2 comments machinelearning
- [R][P] distillKitPlus: High Performant Knowledge Distillation for LLMs https://github.com/agokrani/distillkitplus 2 comments machinelearning
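
For context on what the links above share: all of them build on the classic soft-target distillation loss (Hinton et al.), where a student matches a temperature-softened teacher distribution alongside the usual hard-label loss. Below is a minimal PyTorch sketch of that loss; the temperature `T=4.0` and mixing weight `alpha=0.9` are illustrative assumptions, not values taken from any linked post or repo.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between the temperature-scaled
    # teacher and student distributions. Multiplying by T^2 keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two; alpha near 1 leans on the teacher's soft targets.
    return alpha * soft + (1.0 - alpha) * hard
```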