Published September 11, 2023 | Version v1
Conference paper Open

Leveraging Low-resource Parallel Data for Text Style Transfer

  • 1. Charles University

Description

Text style transfer (TST) involves transforming a text into a desired style while approximately preserving its content. The biggest challenge in TST is the general lack of parallel data. Many existing approaches rely on complex models trained on substantial non-parallel data, with mixed results. In this paper, we leverage a pretrained BART language model with minimal parallel data and incorporate low-resource methods such as hyperparameter tuning, data augmentation, and self-training, which have not previously been explored in TST. We further include novel style-based rewards in the training loss. Through extensive experiments on sentiment transfer, a sub-task of TST, we demonstrate that our simple yet effective approaches achieve well-balanced results, surpassing non-parallel approaches and highlighting the usefulness of parallel data even in small amounts.
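The self-training idea mentioned in the abstract can be illustrated with a toy sketch: train on the small parallel set, pseudo-label unlabeled source sentences with the current model, and keep only confident pseudo-pairs for the next round. The paper fine-tunes BART; here a trivial word-substitution "model" stands in so the loop is runnable, and all function names (`train`, `generate`, `confidence`, `self_train`) and the confidence heuristic are illustrative assumptions, not the authors' code.

```python
def train(pairs):
    """'Train' a substitution model: learn word-level mappings from parallel pairs.

    Stand-in for fine-tuning a seq2seq model such as BART on parallel data.
    """
    model = {}
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            if s != t:
                model[s] = t
    return model

def generate(model, src):
    """Apply the learned substitutions to transfer style."""
    return " ".join(model.get(w, w) for w in src.split())

def confidence(model, src):
    """Crude confidence proxy: fraction of source words the model can rewrite."""
    words = src.split()
    return sum(1 for w in words if w in model) / max(len(words), 1)

def self_train(parallel, unlabeled, rounds=2, threshold=0.2):
    """Grow the parallel set with confident pseudo-pairs, then retrain."""
    data = list(parallel)
    for _ in range(rounds):
        model = train(data)
        seen = {s for s, _ in data}
        for src in unlabeled:
            hyp = generate(model, src)
            # Keep only pseudo-pairs the model actually changed and is confident about.
            if src not in seen and hyp != src and confidence(model, src) >= threshold:
                data.append((src, hyp))
    return train(data)

# Tiny sentiment-transfer example (negative -> positive), illustrative data only.
parallel = [("the food was bad", "the food was good"),
            ("service was awful", "service was great")]
unlabeled = ["the food was awful", "really bad service"]

model = self_train(parallel, unlabeled)
print(generate(model, "the movie was bad"))  # -> "the movie was good"
```

The substitution table is of course far too weak for real style transfer; the point is only the loop structure, in which each round's model produces the pseudo-parallel data for the next.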

Files

2023.inlg-main.27.pdf

287.7 kB
md5:6e565eb395bd46cc3523cc5857ba118d

Additional details

Funding

European Commission
NG-NLG - Next-Generation Natural Language Generation (grant 101039303)