Exploring Text Generation Capabilities in Persian with Specific GPU Requirements
#14 opened by reyhane12
Can these quantized models generate text in Persian? Which GPUs are needed to run these models effectively?
Hi,
I am not sure how capable the original Llama-3-70B is in non-Western languages like Arabic or Persian. I think a separate model fine-tuned on Persian would be needed to improve support and output quality.
The model in 16-bit requires ~140 GB of VRAM, so it must be loaded across multiple GPUs (at least two A100s with 80 GB each, or four A100s with 40 GB each).
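The 140 GB figure follows from a back-of-the-envelope calculation: number of parameters times bytes per parameter. The sketch below (a hypothetical helper, not from any library) shows the same estimate at common quantization levels; it covers weights only, not KV cache or activations.

```python
# Rough VRAM estimate for holding a 70B-parameter model's weights
# at different precisions. Weights only; real usage is higher once
# KV cache and activation memory are included.

def weight_vram_gb(num_params_b: float, bits_per_param: float) -> float:
    """Approximate GB needed for the weights alone."""
    return num_params_b * bits_per_param / 8

for label, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{label}: ~{weight_vram_gb(70, bits):.0f} GB")
# fp16: ~140 GB, 8-bit: ~70 GB, 4-bit: ~35 GB
```

This is why a 4-bit quantization of Llama-3-70B can fit on a single 40 GB or 48 GB GPU, while 16-bit needs several A100s.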
MaziyarPanahi changed discussion status to closed
Thanks