---
license: apache-2.0
datasets:
- MarkrAI/KOpen-HQ-Hermes-2.5-60K
language:
- ko
base_model:
- meta-llama/Llama-3.2-1B-Instruct
pipeline_tag: text-generation
library_name: transformers
---

### Use with transformers

```python
import torch
from transformers import pipeline

model_id = "vitus9988/Llama-3.2-1B-Instruct-Ko-SFT"

# Load the model through the text-generation pipeline.
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

outputs = pipe(
    messages,
    max_new_tokens=256,
)

# The last element of generated_text is the assistant's reply.
print(outputs[0]["generated_text"][-1])
```
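
The same generation can also be run without the `pipeline` helper. The snippet below is a minimal sketch using `AutoTokenizer` and `AutoModelForCausalLM` with the tokenizer's chat template; the Korean prompt and the decoding step are illustrative additions, not part of the original example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vitus9988/Llama-3.2-1B-Instruct-Ko-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative Korean prompt; any chat-style message list works.
messages = [
    {"role": "user", "content": "간단히 자기소개를 해주세요."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens, i.e. the assistant's reply.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

As with the pipeline example, `device_map="auto"` requires the `accelerate` package to be installed.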