---
license: mit
datasets:
- oscar
language:
- uk
library_name: transformers
pipeline_tag: text-generation
---
# GPT2 Ukrainian
A generative language model for Ukrainian that follows the [GPT-2 architecture](https://huggingface.co/gpt2) (124M parameters).
- hidden size: 768
- number of heads: 12
- number of layers: 12
- sequence length: 1024
- tokens: 11,238,113,280 (3 epochs)
- steps: 57,167
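The card does not state a batch size, but one is implied by the figures above: assuming every optimizer step consumes full 1024-token sequences, the per-step batch size and per-epoch token count can be recovered with a quick sanity check.

```python
# Sanity-check the training figures above. The per-step batch size is not
# stated on this card; assuming each optimizer step processes full
# 1024-token sequences, it can be estimated from totals alone.
total_tokens = 11_238_113_280  # 3 epochs, from the card
steps = 57_167
seq_length = 1024

tokens_per_step = total_tokens / steps
implied_batch_size = tokens_per_step / seq_length  # ~192 sequences per step
tokens_per_epoch = total_tokens // 3

print(f"tokens per step:     {tokens_per_step:,.0f}")
print(f"implied batch size:  {implied_batch_size:.1f}")
print(f"tokens per epoch:    {tokens_per_epoch:,}")
```

The estimate comes out to roughly 192 sequences per step; the true value depends on details (gradient accumulation, padding) not recorded here.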
## Training data
- OSCAR
- Wikimedia dumps
## License
MIT