---
license: afl-3.0

language: 
- yo

datasets:
- afriqa
- xlsum
- menyo20k_mt
- alpaca-gpt4
---

# Model Description
**mistral_7b_yo_instruct** is a **text generation** model for Yorùbá, fine-tuned from Mistral 7B for instruction following.

## Intended uses & limitations
#### How to use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "seyabde/mistral_7b_yo_instruct"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    torch_dtype="auto",
).eval()

# Prompt content: "Pẹlẹ o. Bawo ni o se wa?" ("Hello. How are you?")
messages = [
    {"role": "user", "content": "Pẹlẹ o. Bawo ni o se wa?"}
]

# Build the prompt using the model's chat template
input_ids = tokenizer.apply_chat_template(
    conversation=messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
)
# Move inputs to whichever device the model was placed on by device_map="auto"
output_ids = model.generate(input_ids.to(model.device), max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt
response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)

# Model response:
print(response)
```
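
For longer or more varied replies, sampling parameters can be passed to `generate`. The values below are illustrative defaults, not settings tuned for this model:

```python
# Illustrative sampling settings (assumptions, not tuned for this model)
output_ids = model.generate(
    input_ids.to(model.device),
    max_new_tokens=512,       # allow longer answers
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.7,          # moderate randomness
    top_p=0.9,                # nucleus sampling
    repetition_penalty=1.1,   # discourage loops
)
```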


#### Example outputs

```
Ilana (Instruction): '...'

mistral_7b_yo_instruct: '...'
```

#### Eval results
Coming soon

#### Limitations and bias
This model is limited by its training data: instruction-following examples built from news, question answering, and translation corpora covering a specific span of time. It may not generalize well to all use cases or to domains outside that data.

#### Training data
This model is fine-tuned on 60k+ instruction-following demonstrations built from an aggregation of datasets ([AfriQA](https://huggingface.co/datasets/masakhane/afriqa), [XLSum](https://huggingface.co/datasets/csebuetnlp/xlsum), [MENYO-20k](https://huggingface.co/datasets/menyo20k_mt)) and translations of [Alpaca-gpt4](https://huggingface.co/datasets/vicgalle/alpaca-gpt4).
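
As a rough sketch, the source corpora can be pulled with the `datasets` library. The Yorùbá configuration names below are assumptions, and this does not reproduce the exact splits, preprocessing, or mixing used for fine-tuning:

```python
from datasets import load_dataset

# Yorùbá subsets of the source corpora; config names are assumptions,
# and the fine-tuning recipe itself is not reproduced here.
afriqa = load_dataset("masakhane/afriqa", "yor")     # open-retrieval QA
xlsum = load_dataset("csebuetnlp/xlsum", "yoruba")   # news summarization
menyo = load_dataset("menyo20k_mt")                  # English-Yorùbá translation
alpaca = load_dataset("vicgalle/alpaca-gpt4")        # instructions (translated for training)
```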

### Use and safety
We emphasize that mistral_7b_yo_instruct is intended only for research purposes and is not ready for deployment in general use, primarily because we have not designed adequate safety measures.