Leveraging Low-Resource Parallel Data for Text Style Transfer

Sourabrata Mukherjee, Ondřej Dušek

In Sessions:

INLG Oral Session 2: NLG for low-resourced settings (Wednesday, 13:30 CEST, Sun II)


Abstract: Text style transfer (TST) involves transforming a text into a desired style while approximately preserving its content. The biggest challenge in TST is the general lack of parallel data. Many existing approaches rely on complex models using substantial non-parallel data, with mixed results. In this paper, we leverage a pretrained BART language model with minimal parallel data and incorporate low-resource methods such as hyperparameter tuning, data augmentation, and self-training, which have not been explored in TST. We further include novel style-based rewards in the training loss. Through extensive experiments in sentiment transfer, a sub-task of TST, we demonstrate that our simple yet effective approaches achieve well-balanced results, surpassing non-parallel approaches and highlighting the usefulness of parallel data even in small amounts.
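
To make the setup concrete, the sketch below shows what fine-tuning BART on a tiny parallel sentiment-transfer corpus might look like. This is an illustrative sketch using Hugging Face transformers, not the authors' implementation; the model name, learning rate, and example pairs are hypothetical placeholders.

```python
# Minimal sketch: fine-tune BART on a few parallel (negative -> positive) pairs.
import torch
from torch.utils.data import DataLoader
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-base"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Hypothetical parallel pairs standing in for the small parallel data.
pairs = [
    ("the food was terrible", "the food was delicious"),
    ("service was slow and rude", "service was quick and friendly"),
]

def collate(batch):
    sources, targets = zip(*batch)
    enc = tokenizer(list(sources), padding=True, truncation=True, return_tensors="pt")
    labels = tokenizer(list(targets), padding=True, truncation=True, return_tensors="pt").input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # mask padding out of the loss
    enc["labels"] = labels
    return enc

loader = DataLoader(pairs, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

model.train()
for epoch in range(3):
    for batch in loader:
        loss = model(**batch).loss  # standard sequence-to-sequence cross-entropy
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

In the same spirit, one round of the self-training idea mentioned in the abstract could look as follows. The unlabeled pool is hypothetical, and a generic off-the-shelf sentiment classifier stands in for a style classifier: the fine-tuned model rewrites unlabeled negative sentences, and only confidently positive outputs are kept as pseudo-parallel pairs for another fine-tuning pass.

```python
# Hedged sketch of one self-training round, continuing from the model above.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # generic stand-in style classifier
unlabeled = ["the room smelled awful", "i waited an hour for a cold meal"]  # hypothetical pool

model.eval()
pseudo_pairs = []
for sentence in unlabeled:
    inputs = tokenizer(sentence, return_tensors="pt")
    output_ids = model.generate(**inputs, num_beams=4, max_length=64)
    hypothesis = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    prediction = classifier(hypothesis)[0]
    # Keep only outputs the classifier confidently judges as target-style.
    if prediction["label"] == "POSITIVE" and prediction["score"] > 0.9:
        pseudo_pairs.append((sentence, hypothesis))

# pseudo_pairs would then be mixed back into `pairs` for further fine-tuning.
```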