I used the scripts on this page (thank you to whoever provided them): https://github.com/huggingface/transformers/tree/main/examples/pytorch/translation

I changed the model's language pair and created a dataset file using AI: https://huggingface.co/datasets/sdyy/en-ar

All training was done on the free CPU tier in Colab.

An idea: if the dataset could be divided into smaller parts, a free model could be chosen, and several programming enthusiasts could participate and coordinate among themselves. A small, raw, free language model could then be trained on a specific type of data, such as the Python programming language or translation from one language to another, or on large datasets such as Wikipedia.

Instead of each person working hard individually, training together toward one goal could turn the smallest raw free models into something wonderful.

This is just an idea.
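A minimal sketch of the "divide the dataset into smaller parts" idea: each volunteer gets one deterministic shard to train on, so the work splits cleanly with no two people training on the same examples. The `shard_dataset` helper and the toy sentence pairs below are my own illustration, not part of the transformers examples or the en-ar dataset.

```python
# Sketch only: deterministic sharding so volunteers can split one dataset.
# shard_dataset is a hypothetical helper, not an official transformers utility.

def shard_dataset(examples, num_shards, shard_index):
    """Return the shard_index-th of num_shards interleaved slices."""
    if not 0 <= shard_index < num_shards:
        raise ValueError("shard_index must be in [0, num_shards)")
    # Interleaved (strided) slicing keeps each shard representative
    # of the whole dataset, unlike contiguous chunks.
    return examples[shard_index::num_shards]

# Toy stand-in for translation pairs like those in an en-ar dataset.
pairs = [{"en": f"sentence {i}", "ar": f"جملة {i}"} for i in range(10)]

# Three volunteers, three non-overlapping shards.
shards = [shard_dataset(pairs, 3, i) for i in range(3)]
assert sum(len(s) for s in shards) == len(pairs)  # nothing lost
```

For real runs, the Hugging Face `datasets` library provides a similar built-in, `Dataset.shard(num_shards=..., index=...)`, which each participant could call with their own index before training.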