---
license: apache-2.0
language:
- de
tags:
- llama
- alpaca
- llm
- finetune
- german
- transformers
---
|
|
|
# Zicklein: German 🇩🇪 finetuned instruction LLaMA
|
|
|
Visit the GitHub repository for more information: https://github.com/avocardio/zicklein
|
|
|
## Usage |
|
|
|
```python
import torch
from peft import PeftModel
from transformers import LlamaTokenizer, LlamaForCausalLM, GenerationConfig

# Load the base LLaMA-7B tokenizer and model in float16.
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=False,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Apply the German Alpaca-LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(model, "avocardio/alpaca-lora-7b-german-base-52k")
```
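
Since the adapter was finetuned on Alpaca-style instruction data, prompts should follow the Alpaca template. The exact German-translated template lives in the GitHub repository; the helper below is only a sketch using the standard English Alpaca layout as an assumed stand-in (`build_prompt` is a hypothetical name, not part of this repo):

```python
def build_prompt(instruction: str, input_text: str = "") -> str:
    # Standard Alpaca prompt layout (assumption: the repo ships a German
    # translation of this template, which should be preferred in practice).
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

The resulting string can then be tokenized and passed to `model.generate` together with a `GenerationConfig` (e.g. setting `temperature` and `max_new_tokens`), decoding everything after the `### Response:` marker as the model's answer.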