# FineTuning our efficient ArabicT5-49GB-Small model with Torch on 3070 laptop GPU ###
[![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/ArabicT5/blob/main/ArabicT5_49GB_Small_on_3070_Laptop_GPU.ipynb)

If you are running your code on a laptop GPU (e.g., a gaming laptop) or with limited GPU memory, we recommend using our ArabicT5-49GB-Small model, which was the only model from the list that we were able to run on a 3070 laptop GPU with a batch size of 8. We managed to achieve an F1 score of 85.391 (slightly better than our FLAX code) on the TyDi QA task. See the notebook above for reference.
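
As a rough illustration, a minimal sketch of that memory-friendly Torch setup follows. The Hub model id and all hyperparameters below are assumptions for illustration only; the notebook above has the full TyDi QA pipeline.

```python
# Minimal sketch (not the exact notebook code): memory-friendly fine-tuning
# setup for ArabicT5-49GB-Small on a single ~8 GB laptop GPU.
import torch
from transformers import (
    AutoTokenizer,
    T5ForConditionalGeneration,
    Seq2SeqTrainingArguments,
)

model_name = "sultan/ArabicT5-49GB-small"  # assumed Hub id; check the model card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

training_args = Seq2SeqTrainingArguments(
    output_dir="arabict5-small-tydiqa",
    per_device_train_batch_size=8,    # the batch size that fit on the 3070
    gradient_accumulation_steps=4,    # illustrative: emulates a larger batch
    fp16=torch.cuda.is_available(),   # mixed precision to cut memory use
    learning_rate=3e-4,               # illustrative value, not tuned
    num_train_epochs=3,               # illustrative value, not tuned
)
# Pair these with a Seq2SeqTrainer and the TyDi QA preprocessing from the
# notebook above to reproduce the fine-tuning run.
```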
# FineTuning our ArabicT5 model on generative and abstractive tasks with FLAX ###
[COLAB]: https://colab.research.google.com/assets/colab-badge.svg
# FineTuning ArabicT5 on TPUv3-8 with free Kaggle ###
https://www.kaggle.com/code/sultanalrowili/arabict5-on-tydi-with-free-tpuv3-8-with-kaggle
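
Before launching the Kaggle notebook's training loop, it helps to confirm that JAX actually sees all eight TPU cores and that the FLAX checkpoint loads. A minimal sketch (the base-model Hub id is taken from the checkpoint link below; swap in the variant you need):

```python
# Minimal sanity check on a Kaggle TPUv3-8 before fine-tuning.
import jax
from transformers import AutoTokenizer, FlaxT5ForConditionalGeneration

print(jax.device_count())  # expect 8 cores on a TPUv3-8

model_name = "sultan/ArabicT5-49GB-base"  # swap for the Small variant if needed
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = FlaxT5ForConditionalGeneration.from_pretrained(model_name)
```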
# Continual Pre-Training of ArabicT5 with T5x
If you want to continue pre-training ArabicT5 on your own data, we have uploaded the raw T5x checkpoint to this link: https://huggingface.co/sultan/ArabicT5-49GB-base/blob/main/arabict5_49GB_base_t5x.tar.gz

We will soon share a tutorial on how you can do that for free with a Kaggle TPU.
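
Until that tutorial is out, here is a rough stand-in (not the T5x pipeline the raw checkpoint above is meant for): a single T5-style span-corruption step with the Hugging Face checkpoint, just to show the shape of the unsupervised objective. The masked spans below are toy examples.

```python
# NOT the T5x pipeline: a toy span-corruption (denoising) step with the
# Hugging Face checkpoint, to illustrate the pre-training objective.
# T5 masks random spans with sentinel tokens (<extra_id_0>, <extra_id_1>, ...)
# and trains the decoder to reconstruct them.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "sultan/ArabicT5-49GB-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Toy corrupted input / target pair; a real run samples spans from your corpus.
inputs = tokenizer(
    "ArabicT5 is a <extra_id_0> model for <extra_id_1> language tasks.",
    return_tensors="pt",
)
labels = tokenizer(
    "<extra_id_0> text-to-text <extra_id_1> Arabic", return_tensors="pt"
).input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(**inputs, labels=labels).loss  # cross-entropy over sentinel targets
loss.backward()
optimizer.step()
```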