---
license: mit
datasets:
- ccnet-fr
language:
- fr
tags:
- pagnol
---
# PAGnol: An Extra-Large French Generative Model
Paper: ARXIV, ACL ANTHOLOGY
Code: GITHUB
PAGnol is a collection of large French language models geared towards free-form text generation. With up to 1.5 billion parameters, PAGnol is based on the GPT architecture. It is the first language model trained by LightOn, in cooperation with the ALMAnaCH team of Inria.

These models were trained in early 2021, following the scaling laws known at the time and using the exact same training data as the CamemBERT model trained on CCNet. We make them available for reproducibility purposes. They do not represent the current state of the art, nor do they aim to.
PAGnol was built by Julien Launay, E.L. Tommasone, Baptiste Pannier, François Boniface, Amélie Chatelain, Iacopo Poli, and Djamé Seddah. It is named after Marcel Pagnol (with PAG standing for pré-apprentissage génératif), and was trained on the IDRIS Jean Zay supercomputer thanks to a GENCI allocation.
The model was converted to the Hugging Face format by Wissam Antoun (ALMAnaCH PhD student, co-supervised by Benoît Sagot and Djamé Seddah).
## Usage

### Using PAGnol with Hugging Face
```python
from transformers import pipeline

# trust_remote_code is required because PAGnol ships custom modeling code.
generator = pipeline('text-generation', model='lightonai/pagnol-xl', trust_remote_code=True)

output = generator(
    "Salut PAGnol, comment ça va ?",
    max_length=50,
    do_sample=True,
    temperature=0.7,
)[0]["generated_text"]

print(output)
# >>> "Très bien! Les jours d’été sont là ! Bientôt les premiers festivals..."
```
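As background on the `temperature=0.7` setting above, here is a minimal standalone sketch (not PAGnol-specific) of how temperature rescales logits before the softmax: values below 1 sharpen the distribution, concentrating probability on the most likely tokens during sampling.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert logits to probabilities; lower temperature sharpens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, temperature=1.0))
print(softmax_with_temperature(logits, temperature=0.7))  # more mass on the top logit
```

With `do_sample=True`, the model draws the next token from this (temperature-adjusted) distribution rather than always taking the argmax.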
## License
PAGnol is made available under the MIT licence: by downloading the models available below, you agree with the terms of the MIT licence agreement. Under no circumstances will LightOn and/or Inria be held responsible or liable in any way for any claims, damages, losses, expenses, costs or liabilities whatsoever (including, without limitation, any direct or indirect damages for loss of profits, business interruption or loss of information) resulting or arising directly or indirectly from your use of or inability to use PAGnol.
## Available Models
- `lightonai/pagnol-small`: 125M parameters
- `lightonai/pagnol-medium`: 355M parameters
- `lightonai/pagnol-large`: 773M parameters
- `lightonai/pagnol-xl`: 1.5B parameters
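The four checkpoints trade generation quality for memory. As a small illustrative helper (hypothetical, not part of the release), you could pick the largest variant that fits a parameter budget:

```python
# Parameter counts for the released PAGnol checkpoints (from the list above).
PAGNOL_SIZES = {
    "lightonai/pagnol-small": 125_000_000,
    "lightonai/pagnol-medium": 355_000_000,
    "lightonai/pagnol-large": 773_000_000,
    "lightonai/pagnol-xl": 1_500_000_000,
}

def largest_under_budget(max_params: int) -> str:
    """Return the largest PAGnol checkpoint with at most max_params parameters."""
    candidates = {name: n for name, n in PAGNOL_SIZES.items() if n <= max_params}
    if not candidates:
        raise ValueError(f"no PAGnol checkpoint fits within {max_params} parameters")
    return max(candidates, key=candidates.get)

print(largest_under_budget(500_000_000))  # lightonai/pagnol-medium
```

The returned model name can then be passed directly to the `pipeline` call shown in the Usage section.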
## Citation
```bibtex
@inproceedings{launay-etal-2022-pagnol,
    title = "{PAG}nol: An Extra-Large {F}rench Generative Model",
    author = "Launay, Julien and
      Tommasone, E.l. and
      Pannier, Baptiste and
      Boniface, Fran{\c{c}}ois and
      Chatelain, Am{\'e}lie and
      Cappelli, Alessandro and
      Poli, Iacopo and
      Seddah, Djam{\'e}",
    editor = "Calzolari, Nicoletta and
      B{\'e}chet, Fr{\'e}d{\'e}ric and
      Blache, Philippe and
      Choukri, Khalid and
      Cieri, Christopher and
      Declerck, Thierry and
      Goggi, Sara and
      Isahara, Hitoshi and
      Maegaard, Bente and
      Mariani, Joseph and
      Mazo, H{\'e}l{\`e}ne and
      Odijk, Jan and
      Piperidis, Stelios",
    booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.lrec-1.455",
    pages = "4275--4284",
}
```
## Contact

For research enquiries: pagnol@lighton.ai

For business enquiries: customer.relations@lighton.ai