---
license: mit
datasets:
  - oscar
language:
  - uk
library_name: transformers
pipeline_tag: text-generation
---

# GPT2 Ukrainian

A generative language model for the Ukrainian language that follows the GPT-2 architecture (124M parameters).

- hidden size: 768
- number of heads: 12
- number of layers: 12
- sequence length: 1024
- tokens: 11,238,113,280 (3 epochs)
- steps: 57,167
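
For reference, the hyperparameters above map onto a `transformers` `GPT2Config` roughly as sketched below. This is a sketch only; the vocabulary size is not listed in this card, so the stock GPT-2 default is kept as a placeholder:

```python
from transformers import GPT2Config

# Sketch: hyperparameters from the list above. The vocab size is the
# stock GPT-2 default, not a value taken from this card.
config = GPT2Config(
    n_embd=768,        # hidden size
    n_head=12,         # number of attention heads
    n_layer=12,        # number of transformer layers
    n_positions=1024,  # maximum sequence length
)
print(config)
```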

## Training data

- OSCAR
- Wikimedia dumps
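
## Usage

A minimal text-generation sketch with the `transformers` pipeline. The Hub ID `malteos/gpt2-uk` is inferred from this repository's path, and the prompt and sampling settings are purely illustrative:

```python
from transformers import pipeline

# Hub ID inferred from the repository path; adjust if the model lives
# under a different namespace.
generator = pipeline("text-generation", model="malteos/gpt2-uk")

# Illustrative Ukrainian prompt: "Artificial intelligence".
outputs = generator(
    "Штучний інтелект",
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```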

## License

MIT