---
language: eng
license: mit
library_name: transformers
pipeline_tag: translation
widget:
- text: Hi my name is Sarah
- text: Putin is the President of Russia
- text: I will send you a message on Facebook
---

# Model Card for DarijaTranslation-V1

This model translates text from English to Darija (Moroccan Arabic).


### Model Description

Designed for translating text from English into Darija (Moroccan Arabic), this model performs best on general, everyday language such as greetings ("hi, how are you?"). It accurately translates common phrases and sentences typically encountered in informal communication.

- **Developed by:** BAKKALI AYOUB
- **Model type:** Translation
- **Language(s) (NLP):** English to Darija (Moroccan Arabic)
- **License:** MIT
- **Finetuned from model:** marefa-nlp/marefa-mt-en-ar

## How to Get Started with the Model

Use the code below to get started with the model:

```python
from transformers import pipeline

# Initialize the translation pipeline
pipe = pipeline("translation", model="BAKKALIAYOUB/DarijaTranslation-V1")

# Translate text
translated_text = pipe("Putin is the President of Russia")
print(translated_text)
```
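
If you prefer to call the tokenizer and model directly rather than through `pipeline`, the following is a minimal sketch using the generic `AutoTokenizer`/`AutoModelForSeq2SeqLM` classes; the generation settings (e.g. `max_new_tokens`) are illustrative and not taken from this repository:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and seq2seq model from the Hub
tokenizer = AutoTokenizer.from_pretrained("BAKKALIAYOUB/DarijaTranslation-V1")
model = AutoModelForSeq2SeqLM.from_pretrained("BAKKALIAYOUB/DarijaTranslation-V1")

# Tokenize the English input and generate the Darija translation
inputs = tokenizer("Hi, my name is Sarah", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```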

## Training Details

### Training Data

The training data come from the `atlasia/darija-translation` dataset.
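
For reference only, here is a minimal sketch of loading that dataset with the `datasets` library; the exact split names and column layout depend on the published dataset and are not documented in this card:

```python
from datasets import load_dataset

# Load the English-Darija parallel corpus referenced above
dataset = load_dataset("atlasia/darija-translation")

# Inspect the available splits and columns before preprocessing
print(dataset)
```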

### Training Hyperparameters

- **Training regime:** fp16 mixed precision
- **Epochs:** 3
- **Learning rate:** 2e-5
- **Batch size:** 16
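
The training script itself is not published with this card. As a rough illustration only, the hyperparameters above would map onto the standard `transformers` `Seq2SeqTrainingArguments` roughly as follows (`output_dir` is a hypothetical placeholder):

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the card's hyperparameters onto the training API;
# this is not the original training configuration.
training_args = Seq2SeqTrainingArguments(
    output_dir="darija-translation-v1",  # hypothetical output path
    num_train_epochs=3,                  # Epochs: 3
    learning_rate=2e-5,                  # Learning rate: 2e-5
    per_device_train_batch_size=16,      # Batch size: 16
    fp16=True,                           # fp16 mixed precision
)
```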
### Speeds, Sizes, Times

- **Hardware:** GPU P100 with 16 GB of memory
- **Training progress:**

| Epoch | Training Loss | Validation Loss |
|-------|---------------|-----------------|
| 1     | 0.349600      | 0.311435        |
| 2     | 0.305100      | 0.280260        |
| 3     | 0.277700      | 0.268511        |
| 4     | 0.270000      | 0.264618        |