Description
Collaborative Distillation is a knowledge distillation method for encoder-decoder based neural style transfer, designed to reduce the number of convolutional filters. Its main idea rests on the finding that an encoder-decoder pair forms an exclusive collaborative relationship, which is regarded as a new kind of knowledge for style transfer models: the compressed student encoder is trained so that the teacher's decoder can still work with its features, as sketched below.
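To make the idea concrete, here is a minimal PyTorch sketch of one interpretation of this collaborative constraint: a slimmed student encoder is trained so that a frozen teacher decoder can still reconstruct images from its linearly embedded features. All module names, architectures, and loss weights below are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of collaborative distillation for an encoder-decoder
# style-transfer model. All names and hyperparameters are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Slimmed student encoder with fewer convolutional filters."""
    def __init__(self, out_ch=32, teacher_ch=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        # 1x1 linear embedding mapping the slim student features back into
        # the teacher's feature space, so the fixed teacher decoder can
        # decode them -- this realizes the "collaborative" link.
        self.embed = nn.Conv2d(out_ch, teacher_ch, 1)

    def forward(self, x):
        f = self.features(x)
        return f, self.embed(f)

def distill_step(student, teacher_enc, teacher_dec, images,
                 w_feat=1.0, w_rec=1.0):
    """One training step: the student must emit features that mimic the
    teacher's and that the frozen teacher decoder can reconstruct from."""
    with torch.no_grad():
        t_feat = teacher_enc(images)              # frozen teacher features
    _, s_embedded = student(images)
    # Feature-matching term: embedded student features mimic the teacher's.
    loss_feat = F.mse_loss(s_embedded, t_feat)
    # Collaboration term: the frozen teacher decoder reconstructs the image
    # from the student's embedded features.
    recon = teacher_dec(s_embedded)
    loss_rec = F.mse_loss(recon, images)
    return w_feat * loss_feat + w_rec * loss_rec

# Toy frozen teacher pair (stand-ins for a pretrained encoder/decoder).
teacher_enc = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
teacher_dec = nn.Sequential(nn.Conv2d(64, 3, 3, padding=1))
for p in list(teacher_enc.parameters()) + list(teacher_dec.parameters()):
    p.requires_grad_(False)

student = SmallEncoder()
imgs = torch.randn(2, 3, 64, 64)
loss = distill_step(student, teacher_enc, teacher_dec, imgs)
loss.backward()  # gradients flow only into the student
```

The key design choice in this sketch is that the teacher decoder stays frozen, so the only way to lower the reconstruction loss is for the student encoder to produce features the teacher decoder already knows how to use.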
Papers Using This Method
Noise Fusion-based Distillation Learning for Anomaly Detection in Complex Industrial Environments (2025-06-19)
Continual Collaborative Distillation for Recommender System (2024-05-29)
Federated Learning via Input-Output Collaborative Distillation (2023-12-22)
Multi-label Emotion Analysis in Conversation via Multimodal Knowledge Distillation (2023-10-27)
DevFormer: A Symmetric Transformer for Context-Aware Device Placement (2022-05-26)
Revisiting Discriminator in GAN Compression: A Generator-discriminator Cooperative Compression Scheme (2021-10-27)
Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision (2021-09-03)
Compressing Facial Makeup Transfer Networks by Collaborative Distillation and Kernel Decomposition (2020-09-16)
Collaborative Distillation for Ultra-Resolution Universal Style Transfer (2020-03-18)