Description
Teacher-Tutor-Student Knowledge Distillation is a method for image-based virtual try-on models. It treats the fake images produced by a parser-based method as "tutor knowledge", whose artifacts can be corrected by real "teacher knowledge" extracted from real person images in a self-supervised way. Beyond using real images as supervision, knowledge distillation is formulated in the try-on problem as distilling the appearance flows between the person image and the garment image, which finds dense correspondences between them and produces high-quality try-on results.
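A minimal sketch of the two supervision signals described above, written in PyTorch. The function name, tensor shapes, loss choices, and the `lambda_flow` weight are illustrative assumptions, not the exact losses of any specific paper: the parser-free "student" is matched to the real image (teacher knowledge) while its appearance flows are distilled from the parser-based "tutor".

```python
import torch
import torch.nn.functional as F

def distillation_losses(student_out, student_flows, tutor_flows,
                        real_image, lambda_flow=1.0):
    """Hypothetical combined loss for teacher-tutor-student distillation.

    student_out:   (B, 3, H, W) try-on image from the parser-free student
    student_flows: list of (B, 2, h, w) appearance-flow maps from the student
    tutor_flows:   list of (B, 2, h, w) flows from the frozen parser-based tutor
    real_image:    (B, 3, H, W) real person image used as teacher supervision
    """
    # Teacher knowledge: supervise the student's output with the real image,
    # so artifacts inherited from the tutor's fake images can be corrected.
    teacher_loss = F.l1_loss(student_out, real_image)

    # Tutor knowledge: distill the dense garment-to-person appearance flows
    # predicted at each level of the warping module.
    flow_loss = sum(F.mse_loss(s, t.detach())
                    for s, t in zip(student_flows, tutor_flows))

    return teacher_loss + lambda_flow * flow_loss
```

In this reading, the flow term transfers the dense correspondences the tutor has learned, while the image term lets the real photo override the tutor wherever its warping is wrong.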
Papers Using This Method
A Deep Knowledge Distillation framework for EEG assisted enhancement of single-lead ECG based sleep staging (2021-12-14)
Distilling Audio-Visual Knowledge by Compositional Contrastive Learning (2021-04-22)
Parser-Free Virtual Try-on via Distilling Appearance Flows (2021-03-08)
Self-supervised driven consistency training for annotation efficient histopathology image analysis (2021-02-07)