
This model is an instruction-tuned version of Llama 3.2 180M Amharic.

How to use

Chat Format

Given the nature of the training data, this instruct model is best suited for prompts that use the chat format shown below. You can provide the prompt as a question using the following generic template:

<|im_start|>user
ጥያቄ?<|im_end|>
<|im_start|>assistant

For example:

<|im_start|>user
ሶስት የአፍሪካ ሀገራት ጥቀስልኝ<|im_end|>
<|im_start|>assistant

where the model generates its answer after <|im_start|>assistant.
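Assuming the tokenizer ships with a chat template that matches this format (the pipeline example in the next section accepts chat-style messages, which suggests it does), you can also build the prompt programmatically instead of concatenating the special tokens by hand. A minimal sketch:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("rasyosef/Llama-3.2-180M-Amharic-Instruct")

messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]

# Render the chat template as a string; add_generation_prompt appends the
# <|im_start|>assistant header so the model continues with its answer.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)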

Sample inference code

First, install the latest version of the transformers library:

pip install -Uq transformers

You can use this model directly with a pipeline for text generation:

from transformers import pipeline

# Load the instruct model as a text-generation pipeline.
llama3_am = pipeline(
    "text-generation",
    model="rasyosef/Llama-3.2-180M-Amharic-Instruct",
    device_map="auto"
)

# Pass the prompt as chat messages; return_full_text=False returns only
# the model's newly generated reply, without echoing the prompt.
messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]
llama3_am(messages, max_new_tokens=128, repetition_penalty=1.1, return_full_text=False)

Output:

[{'generated_text': '1. ግብፅ 2. ኢትዮጵያ 3. ኬንያ'}]
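If you prefer not to use the pipeline, the sketch below loads the model and tokenizer directly and calls generate. It assumes the tokenizer's chat template matches the format shown above, and the generation settings mirror the pipeline call.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rasyosef/Llama-3.2-180M-Amharic-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]

# Tokenize the chat-formatted prompt, ending with the assistant header.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

with torch.no_grad():
    output_ids = model.generate(
        input_ids,
        max_new_tokens=128,
        repetition_penalty=1.1,
    )

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))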