LexGLUE
Modality: Texts · Introduced: 2021-10-03
The Legal General Language Understanding Evaluation (LexGLUE) benchmark is a collection of datasets for evaluating model performance across a diverse set of legal natural language understanding (NLU) tasks in a standardized way.
Source: LexGLUE paper, https://arxiv.org/pdf/2110.00976v1.pdf