TernaryBERT

Natural Language Processing · Introduced 2020 · 2 papers

Description

TernaryBERT is a Transformer-based model that ternarizes the weights of a pretrained BERT model to {-1, 0, +1}, using different granularities for the word embeddings and for the weights in the Transformer layers. Rather than applying knowledge distillation directly to compress the model, distillation is used to improve the performance of the ternarized student model, which has the same architecture as the teacher. In this way, knowledge is transferred from the highly accurate full-precision teacher model to the ternarized student model, whose capacity is much smaller.
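As a rough illustration of weight ternarization, the sketch below uses a threshold-based scheme in the style of Ternary Weight Networks: weights below a threshold are zeroed, and the rest are replaced by a single learned scaling factor times their sign. This is a minimal sketch, not TernaryBERT's exact procedure (the paper combines approximation-based and loss-aware ternarization), and `delta_scale` is a hypothetical knob.

```python
import numpy as np

def ternarize(w, delta_scale=0.7):
    """Map a float weight array to {-alpha, 0, +alpha}.

    A TWN-style approximation (an assumption for illustration):
    - delta: threshold below which weights are set to zero
    - alpha: mean magnitude of the surviving weights, used as the scale
    """
    delta = delta_scale * np.mean(np.abs(w))               # zeroing threshold
    mask = np.abs(w) > delta                               # positions kept nonzero
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0  # per-tensor scaling factor
    return alpha * np.sign(w) * mask

w = np.array([0.9, -0.05, -1.2, 0.3])
tw = ternarize(w)  # small weights -> 0, large weights -> +/- alpha
```

For the example above, delta = 0.7 * 0.6125 ≈ 0.43, so only 0.9 and -1.2 survive, alpha = (0.9 + 1.2) / 2 = 1.05, and the result is [1.05, 0, -1.05, 0]. In TernaryBERT the granularity differs by layer type: per-row scales for word embeddings and per-tensor scales for Transformer weights.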

Papers Using This Method