SqueezeBERT

Natural Language Processing · Introduced 2020

Description

SqueezeBERT is an efficient architectural variant of BERT for natural language processing. It is much like BERT-base, but the position-wise fully-connected layers are implemented as convolutions, and grouped convolutions are used for many of those layers, which reduces both parameter count and computation.
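A minimal sketch of the idea, assuming PyTorch: a position-wise fully-connected layer is equivalent to a 1D convolution with kernel size 1, and swapping it for a grouped convolution divides its weight count by roughly the number of groups. The dimensions and group count below are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (not the paper's exact config).
batch, seq_len, hidden = 2, 8, 768
groups = 4

x = torch.randn(batch, hidden, seq_len)  # Conv1d expects (N, C, L)

# A position-wise fully-connected layer expressed as a 1x1 convolution.
dense = nn.Conv1d(hidden, hidden, kernel_size=1)

# Grouped variant: each of the `groups` channel groups is convolved
# independently, so the weight tensor shrinks by a factor of `groups`.
grouped = nn.Conv1d(hidden, hidden, kernel_size=1, groups=groups)

print(sum(p.numel() for p in dense.parameters()))    # 768*768 + 768 = 590592
print(sum(p.numel() for p in grouped.parameters()))  # 768*192 + 768 = 148224

y = grouped(x)
print(y.shape)  # torch.Size([2, 768, 8])
```

Both layers map the same input shape to the same output shape; only the connectivity pattern (and therefore the cost) differs.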
