---
language:
- en
pipeline_tag: text2text-generation
library_name: transformers
tags:
- style-transfer
- formality-transfer
---
# Text Style Transfer using CycleGANs
This repository contains the models from the paper "Self-supervised Text Style Transfer using Cycle-Consistent Adversarial Networks" (ACM TIST 2024).\
The work introduces a novel approach to Text Style Transfer based on CycleGANs, combining sequence-level self-supervision with Transformer architectures.
## Available Models
### Formality transfer
#### GYAFC dataset (Family & Relationships)
| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [informal-to-formal](https://huggingface.co/ggallipoli/bart-base_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/bart-base_for2inf_family) |
| BART large | [informal-to-formal](https://huggingface.co/ggallipoli/bart-large_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/bart-large_for2inf_family) |
| T5 small | [informal-to-formal](https://huggingface.co/ggallipoli/t5-small_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-small_for2inf_family) |
| T5 base | [informal-to-formal](https://huggingface.co/ggallipoli/t5-base_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-base_for2inf_family) |
| T5 large | [informal-to-formal](https://huggingface.co/ggallipoli/t5-large_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-large_for2inf_family) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/formality_classifier_gyafc_family) |
#### GYAFC dataset (Entertainment & Music)
| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [informal-to-formal](https://huggingface.co/ggallipoli/bart-base_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/bart-base_for2inf_music) |
| BART large | [informal-to-formal](https://huggingface.co/ggallipoli/bart-large_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/bart-large_for2inf_music) |
| T5 small | [informal-to-formal](https://huggingface.co/ggallipoli/t5-small_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-small_for2inf_music) |
| T5 base | [informal-to-formal](https://huggingface.co/ggallipoli/t5-base_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-base_for2inf_music) |
| T5 large | [informal-to-formal](https://huggingface.co/ggallipoli/t5-large_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-large_for2inf_music) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/formality_classifier_gyafc_music) |
### Sentiment transfer
#### Yelp dataset
| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [negative-to-positive](https://huggingface.co/ggallipoli/bart-base_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/bart-base_pos2neg) |
| BART large | [negative-to-positive](https://huggingface.co/ggallipoli/bart-large_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/bart-large_pos2neg) |
| T5 small | [negative-to-positive](https://huggingface.co/ggallipoli/t5-small_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-small_pos2neg) |
| T5 base | [negative-to-positive](https://huggingface.co/ggallipoli/t5-base_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-base_pos2neg) |
| T5 large | [negative-to-positive](https://huggingface.co/ggallipoli/t5-large_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-large_pos2neg) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/sentiment_classifier_yelp) |
## Model Description
The models implement a CycleGAN architecture for Text Style Transfer that:
- Applies self-supervision directly at the sequence level
- Preserves the source content while transferring style attributes
- Employs pre-trained style classifiers to guide generation
- Uses Transformer-based generators and discriminators
The models achieve state-of-the-art results on both formality and sentiment transfer tasks.
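At inference time, the released checkpoints can be chained to reproduce the cycle that drives the self-supervised training signal. The snippet below is only an illustration: it assumes the two BART base GYAFC (Family & Relationships) generators listed above and default generation settings, not the exact configuration used in the paper.
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Illustrative checkpoints: the two transfer directions for GYAFC (Family & Relationships).
inf2for = AutoModelForSeq2SeqLM.from_pretrained("ggallipoli/bart-base_inf2for_family")
for2inf = AutoModelForSeq2SeqLM.from_pretrained("ggallipoli/bart-base_for2inf_family")
inf2for_tok = AutoTokenizer.from_pretrained("ggallipoli/bart-base_inf2for_family")
for2inf_tok = AutoTokenizer.from_pretrained("ggallipoli/bart-base_for2inf_family")

informal = "gotta go now, talk to u later!"

# Forward transfer: informal -> formal.
ids = inf2for_tok(informal, return_tensors="pt")
formal = inf2for_tok.decode(inf2for.generate(**ids, max_new_tokens=64)[0], skip_special_tokens=True)

# Backward transfer: formal -> informal, closing the cycle.
ids = for2inf_tok(formal, return_tensors="pt")
reconstruction = for2inf_tok.decode(for2inf.generate(**ids, max_new_tokens=64)[0], skip_special_tokens=True)

# During training, a sequence-level loss between `reconstruction` and `informal`
# provides the cycle-consistency (self-supervision) signal; here we simply inspect the outputs.
print(formal)
print(reconstruction)
```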
## Usage
Both the generators and the style classifiers can be used with the Hugging Face 🤗 `transformers` library.\
Each generator can be loaded as follows:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("[GENERATOR_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[GENERATOR_MODEL]")
```
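For instance, using the Yelp negative-to-positive generator from the tables above (an illustrative sketch: the checkpoint, input sentence, and generation settings are arbitrary choices, and any other generator can be substituted):
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Illustrative checkpoint: negative -> positive sentiment transfer (Yelp).
model_name = "ggallipoli/bart-base_neg2pos"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

inputs = tokenizer("the food was cold and the service was terrible.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```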
Each style classifier can be loaded as follows:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
classifier = AutoModelForSequenceClassification.from_pretrained("[CLASSIFIER_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[CLASSIFIER_MODEL]")
```
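For example, a sentence can be scored with the GYAFC (Family & Relationships) formality classifier as sketched below; this is illustrative only, and the index-to-style mapping should be read from the checkpoint's `id2label` config rather than assumed:
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Illustrative checkpoint: formality classifier trained on GYAFC (Family & Relationships).
classifier_name = "ggallipoli/formality_classifier_gyafc_family"
classifier = AutoModelForSequenceClassification.from_pretrained(classifier_name)
tokenizer = AutoTokenizer.from_pretrained(classifier_name)

inputs = tokenizer("I would be delighted to attend the meeting.", return_tensors="pt")
with torch.no_grad():
    logits = classifier(**inputs).logits
probs = logits.softmax(dim=-1).squeeze()

# The meaning of each class index is defined in the checkpoint's config (id2label).
print({classifier.config.id2label[i]: round(p.item(), 3) for i, p in enumerate(probs)})
```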
## Citation
For more details, you can refer to the [paper](https://dl.acm.org/doi/10.1145/3678179).
```bibtex
@article{10.1145/3678179,
  author = {La Quatra, Moreno and Gallipoli, Giuseppe and Cagliero, Luca},
  title = {Self-supervised Text Style Transfer Using Cycle-Consistent Adversarial Networks},
  year = {2024},
  issue_date = {October 2024},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  volume = {15},
  number = {5},
  issn = {2157-6904},
  url = {https://doi.org/10.1145/3678179},
  doi = {10.1145/3678179},
  journal = {ACM Trans. Intell. Syst. Technol.},
  month = nov,
  articleno = {110},
  numpages = {38},
  keywords = {Text Style Transfer, Sentiment transfer, Formality transfer, Cycle-consistent Generative Adversarial Networks, Transformers}
}
```
## Code
The full implementation is available at [gallipoligiuseppe/TST-CycleGAN](https://github.com/gallipoligiuseppe/TST-CycleGAN).
## License
This work is licensed under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/).