Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


FHAC at GermEval 2021: Identifying German toxic, engaging, and fact-claiming comments with ensemble learning

Tobias Bornheim, Niklas Grieger, Stephan Bialonski

Published: 2021-09-07
Dataset: GermEval 2021
Tasks: Ensemble Learning · Toxic Comment Classification · Classification of toxic, engaging, fact-claiming comments
Links: Paper · PDF · Code (official)

Abstract

The availability of language representations learned by large pretrained neural network models (such as BERT and ELECTRA) has led to improvements in many downstream Natural Language Processing tasks in recent years. Pretrained models usually differ in the pretraining objectives, architectures, and datasets they are trained on, which can affect downstream performance. In this contribution, we fine-tuned German BERT and German ELECTRA models to identify toxic (subtask 1), engaging (subtask 2), and fact-claiming comments (subtask 3) in Facebook data provided by the GermEval 2021 competition. We created ensembles of these models and investigated whether and how classification performance depends on the number of ensemble members and their composition. On out-of-sample data, our best ensemble achieved a macro-F1 score of 0.73 (for all subtasks), and F1 scores of 0.72, 0.70, and 0.76 for subtasks 1, 2, and 3, respectively.
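The abstract describes building ensembles of fine-tuned classifiers. A common way to combine such members is soft voting: average the class probabilities each model assigns and predict the class with the highest mean. The sketch below illustrates this idea with toy probability arrays standing in for the outputs of three fine-tuned models; it is an assumption for illustration, not the authors' actual GBERT/GELECTRA pipeline.

```python
import numpy as np

def soft_vote(prob_list):
    """Average class probabilities across ensemble members, then argmax.

    prob_list: list of (n_samples, n_classes) arrays, one per member.
    """
    avg = np.mean(prob_list, axis=0)
    return avg.argmax(axis=1)

# Toy probabilities from three hypothetical fine-tuned models on
# 4 comments for a binary task (e.g. non-toxic vs. toxic).
m1 = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8], [0.7, 0.3]])
m2 = np.array([[0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.6, 0.4]])
m3 = np.array([[0.7, 0.3], [0.3, 0.7], [0.1, 0.9], [0.8, 0.2]])

print(soft_vote([m1, m2, m3]).tolist())  # → [0, 1, 1, 0]
```

Note how the second comment is classified as positive even though one member disagrees: averaging lets confident members outvote uncertain ones, which is one reason ensemble composition (and not just size) matters.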

Results

| Task                | Dataset                                  | Metric | Value | Model                   |
|---------------------|------------------------------------------|--------|-------|-------------------------|
| Text Classification | GermEval 2021 - Toxic Comments test set  | F1     | 71.8  | GBERT/GELECTRA Ensemble |
| Classification      | GermEval 2021 - Toxic Comments test set  | F1     | 71.8  | GBERT/GELECTRA Ensemble |
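The headline metric here is (macro-)F1: the unweighted mean of per-class F1 scores, which weights rare and frequent classes equally. A minimal from-scratch sketch, with made-up labels rather than any GermEval data:

```python
def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-class F1 scores."""
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Toy binary example (0 = non-toxic, 1 = toxic).
y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
print(round(macro_f1(y_true, y_pred, [0, 1]), 3))  # → 0.667
```

This matches `sklearn.metrics.f1_score(..., average="macro")`; the in-table F1 of 71.8 is reported on this 0-100 scale times 100.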

Related Papers

Simulate, Refocus and Ensemble: An Attention-Refocusing Scheme for Domain Generalization (2025-07-17)
MEL: Multi-level Ensemble Learning for Resource-Constrained Environments (2025-06-25)
Divide, Specialize, and Route: A New Approach to Efficient Ensemble Learning (2025-06-25)
Learning Personalized Utility Functions for Drivers in Ride-hailing Systems Using Ensemble Hypernetworks (2025-06-21)
RocketStack: A level-aware deep recursive ensemble learning framework with exploratory feature fusion and model pruning dynamics (2025-06-20)
A Model-Mediated Stacked Ensemble Approach for Depression Prediction Among Professionals (2025-06-17)
ESRPCB: an Edge guided Super-Resolution model and Ensemble learning for tiny Printed Circuit Board Defect detection (2025-06-16)
ContextRefine-CLIP for EPIC-KITCHENS-100 Multi-Instance Retrieval Challenge 2025 (2025-06-12)