Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Semi-Supervised Formality Style Transfer with Consistency Training

Ao Liu, An Wang, Naoaki Okazaki

2022-03-25 · ACL 2022 · Formality Style Transfer
Paper · PDF · Code (official)

Abstract

Formality style transfer (FST) is the task of paraphrasing an informal sentence into a formal one without altering its meaning. To address the data-scarcity problem of existing parallel datasets, previous studies tend to adopt a cycle-reconstruction scheme to utilize additional unlabeled data, where the FST model mainly benefits from target-side unlabeled sentences. In this work, we propose a simple yet effective semi-supervised framework that makes better use of source-side unlabeled sentences through consistency training. Specifically, our approach augments pseudo-parallel data obtained from a source-side informal sentence by enforcing the model to generate similar outputs for its perturbed version. Moreover, we empirically examine the effects of various data perturbation methods and propose effective data filtering strategies to improve our framework. Experimental results on the GYAFC benchmark demonstrate that our approach achieves state-of-the-art results, even with less than 40% of the parallel data.
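The core idea described above can be sketched in a few lines: perturb a source-side informal sentence, run the model on both the original and the perturbed input, and penalize divergence between the two output distributions. The sketch below is a minimal, self-contained illustration of that consistency objective, not the paper's actual implementation; the function names (`word_dropout`, `consistency_loss`) and the choice of word dropout as the perturbation and KL divergence as the divergence measure are illustrative assumptions.

```python
import math
import random

def word_dropout(tokens, p=0.1, unk="<unk>", seed=0):
    # One simple input perturbation: randomly replace tokens with <unk>.
    rng = random.Random(seed)
    return [unk if rng.random() < p else t for t in tokens]

def kl_divergence(p, q, eps=1e-9):
    # KL(p || q) between two discrete distributions over the same vocabulary.
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def consistency_loss(probs_orig, probs_pert):
    # Sum, over output positions, of the divergence between the model's
    # output distributions for the original and the perturbed input.
    # Each argument is a list of per-position probability distributions.
    return sum(kl_divergence(p, q) for p, q in zip(probs_orig, probs_pert))

# Toy usage: identical predictions incur no loss; divergent ones are penalized.
tokens = "could you please send me the report".split()
perturbed = word_dropout(tokens, p=0.3, seed=1)
same = consistency_loss([[0.5, 0.5]], [[0.5, 0.5]])      # ~0.0
different = consistency_loss([[0.9, 0.1]], [[0.1, 0.9]])  # > 0
```

In training, this consistency term would be added to the supervised loss on the parallel data, so that the unlabeled source-side sentences contribute a regularization signal without requiring formal-side references.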

Results

Task                 Dataset  Metric  Value  Model
Text Generation      GYAFC    BLEU    81.37  Consistency Training
Text Style Transfer  GYAFC    BLEU    81.37  Consistency Training

Related Papers

Formality Style Transfer in Persian (2024-06-02)
ICLEF: In-Context Learning with Expert Feedback for Explainable Style Transfer (2023-09-15)
Reducing Sequence Length by Predicting Edit Operations with Large Language Models (2023-05-19)
CoEdIT: Text Editing by Task-Specific Instruction Tuning (2023-05-17)
Evaluating the Evaluation Metrics for Style Transfer: A Case Study in Multilingual Formality Transfer (2021-10-20)
Improving Formality Style Transfer with Context-Aware Rule Injection (2021-06-01)
Olá, Bonjour, Salve! XFORMAL: A Benchmark for Multilingual Formality Style Transfer (2021-06-01)
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (2021-05-14)