IKD on DAFL
Iterative Knowledge Distillation on Data-Free Learning (DAFL)
This project was carried out in the APPCAIR Lab in collaboration with TCS-Research. It aims to develop an iterative method for performing knowledge distillation within the Data-Free Learning (DAFL) framework. We propose a modification to the existing loss function that speeds up training.
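Since the modified loss itself is not spelled out here, the sketch below only illustrates the standard knowledge-distillation objective that DAFL-style training builds on: a KL divergence between the teacher's and student's temperature-softened outputs. The function name and the temperature value are illustrative assumptions, not part of this project's code.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Standard knowledge-distillation loss (Hinton et al.): KL divergence
    between the teacher's and student's softened class distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```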