---
language:
- ms
---
# Pretraining a 1.1B Mistral with 4096 context length on Malaysian text
README at https://github.com/mesolitica/malaya/tree/5.1/pretrained-model/mistral

- Dataset gathered at https://github.com/malaysia-ai/dedup-text-dataset/tree/main/pretrain-llm
- Trained on a Ray cluster of 5 nodes, each with 4x A100 80GB GPUs, https://github.com/malaysia-ai/jupyter-gpu/tree/main/ray
- WandB run at https://wandb.ai/mesolitica/pretrain-mistral-1.1b?workspace=user-husein-mesolitica
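The card does not spell out which model dimensions add up to "1.1B". The sketch below counts parameters for an *assumed* TinyLlama-style Mistral configuration (hidden size 2048, 22 layers, grouped-query attention with 4 KV heads); these dimensions are assumptions for illustration, not values taken from this training run.

```python
def mistral_param_count(
    vocab_size: int,
    hidden_size: int,
    num_layers: int,
    num_heads: int,
    num_kv_heads: int,
    intermediate_size: int,
) -> int:
    """Count trainable parameters of a Mistral-style decoder-only model."""
    head_dim = hidden_size // num_heads
    kv_dim = head_dim * num_kv_heads  # grouped-query attention shrinks K/V projections
    # Attention: Q and O are hidden x hidden; K and V are hidden x kv_dim (no biases).
    attn = 2 * hidden_size * hidden_size + 2 * hidden_size * kv_dim
    # SwiGLU MLP: gate and up (hidden -> intermediate) plus down (intermediate -> hidden).
    mlp = 3 * hidden_size * intermediate_size
    # Two RMSNorm weight vectors per layer.
    norms = 2 * hidden_size
    per_layer = attn + mlp + norms
    # Input embeddings + untied LM head + final RMSNorm.
    outside = 2 * vocab_size * hidden_size + hidden_size
    return num_layers * per_layer + outside


# Assumed TinyLlama-style 1.1B dimensions (hypothetical, not from this card).
total = mistral_param_count(
    vocab_size=32000,
    hidden_size=2048,
    num_layers=22,
    num_heads=32,
    num_kv_heads=4,
    intermediate_size=5632,
)
print(f"{total / 1e9:.2f}B parameters")  # -> 1.10B parameters
```

Under these assumed dimensions the count lands at roughly 1.10B, which is consistent with the "1.1B" in the title; the actual checkpoint may use different dimensions.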