---
language: es
tags:
  - text-generation
datasets:
  - oscar
widget:
  - text: 'Érase una vez '
  - text: >-
      Frase: Esta película es muy agradable. Sentimiento: positivo Frase: Odiaba
      esta película, apesta. Sentimiento: negativo Frase: Esta película fue
      bastante mala. Sentimiento: 
license: apache-2.0
---

# Spanish GPT-2

GPT-2 model trained from scratch on the Spanish portion of OSCAR. The model was trained with Flax on TPUs sponsored by Google, as part of the Flax/JAX Community Week organised by Hugging Face.

## Model description

The model used for training is OpenAI's GPT-2, introduced in the paper "Language Models are Unsupervised Multitask Learners" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.

This model is available in the 🤗 Model Hub.
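A minimal usage sketch with the 🤗 Transformers `pipeline` API (the repo id below is an assumption; substitute this model's actual Hub identifier):

```python
from transformers import pipeline

# Assumed Hub repo id; replace with this model's actual identifier.
generator = pipeline("text-generation", model="flax-community/gpt-2-spanish")

# Generate a short continuation of a Spanish prompt.
print(generator("Érase una vez ", max_length=50, num_return_sequences=1)[0]["generated_text"])
```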

## Training data

The Spanish portion of OSCAR, the Open Super-large Crawled ALMAnaCH coRpus, a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.

This corpus is available in the 🤗 Datasets library.
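A minimal sketch of loading the corpus with 🤗 Datasets, assuming the `unshuffled_deduplicated_es` configuration; streaming avoids downloading the full corpus up front:

```python
from datasets import load_dataset

# Stream the Spanish OSCAR split instead of downloading it in full.
dataset = load_dataset("oscar", "unshuffled_deduplicated_es", split="train", streaming=True)

# Inspect the opening characters of the first document.
print(next(iter(dataset))["text"][:200])
```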

## Team members