Tag - Knowledge Distillation
MEAL: Multi-Model Ensemble via Adversarial Learning
Info
- Conference: AAAI 2019, Oral
- Cites: 0
- Github Stars: 116
- Github Solved/Issue: 4/11
- Author:
- In my tests, the results could not be reproduced from the github source code, and the performance falls short of the best single model. Blacklisting the author.
Some Questions
- What is the three-layer FC Discriminator actually learning?
- The comparisons are mainly against traditional ensembles; there is no comparison to other KD methods at all.
- Comparing parameter counts, it should be stronger than On-the-fly (ONE).
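On the first question above, a minimal sketch of what a MEAL-style three-layer FC discriminator would be doing (all names, dimensions, and the loss setup here are my own illustration, not taken from the MEAL code): it classifies whether a feature vector came from the teacher ensemble or the student, and the student is trained adversarially to fool it, pushing the student's feature distribution toward the teacher's.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Hypothetical three-layer FC discriminator for adversarial KD:
    outputs a logit for teacher (1) vs. student (0) features."""
    def __init__(self, feat_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),  # single logit
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.net(feats)

feat_dim = 64
D = Discriminator(feat_dim)
bce = nn.BCEWithLogitsLoss()

# Stand-in features; in the paper these would come from intermediate
# layers of the teacher ensemble and the student network.
teacher_feats = torch.randn(8, feat_dim)
student_feats = torch.randn(8, feat_dim, requires_grad=True)

# D step: tell teacher features (label 1) from student features (label 0).
d_loss = bce(D(teacher_feats), torch.ones(8, 1)) \
       + bce(D(student_feats.detach()), torch.zeros(8, 1))

# Student step: fool D, i.e. make student features look like teacher's.
g_loss = bce(D(student_feats), torch.ones(8, 1))
```

So under this reading, the discriminator learns a decision boundary between teacher and student feature distributions, and the student's adversarial loss is a learned distribution-matching term on top of the usual distillation loss.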
Explaining Knowledge Distillation by Quantifying the Knowledge
Info
- Conference: CVPR 2020
- Cites: 0
- Github Stars: --
- Github Solved/Issue: --
- Author: