RusEnQA
QA for Russian and English based on the rugpt3xl model
About ruGPT-3 XL model
The model was trained with a sequence length of 512, using Deepspeed and Megatron code by the SberDevices team, on an 80B-token dataset for 4 epochs. After that, the model was fine-tuned for 1 epoch with a sequence length of 2048. Note: the model uses sparse attention blocks.
Total training time was around 10 days on 256 GPUs. Final perplexity on the test set is 12.05. The model has 1.3B parameters.
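
Below is a minimal sketch of loading ruGPT-3 XL and generating a QA-style completion. It assumes the sberbank-ai/ru-gpts repository is cloned locally with its DeepSpeed sparse-attention dependencies installed; the wrapper module path, the `generate` signature, and the prompt format are assumptions and may differ from your checkout.

```python
import sys

# Assumes the sberbank-ai/ru-gpts repo is cloned into ./ru-gpts/
sys.path.append("ru-gpts/")
from src.xl_wrapper import RuGPT3XL  # module path is an assumption

# Load the 1.3B-parameter checkpoint; seq_len=512 matches the context
# length the model was originally trained with.
gpt = RuGPT3XL.from_pretrained("sberbank-ai/rugpt3xl", seq_len=512)

# Hypothetical QA-style prompt; the exact prompt template RusEnQA uses
# is an assumption here.
prompt = "Question: What is the capital of Russia?\nAnswer:"

# generate() is assumed to return a list of decoded continuations.
print(gpt.generate(prompt, max_length=64)[0])
```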