Facico committed Commit 3a0c041 (1 Parent(s): 4efe1cf)

Update README.md

Files changed (1): README.md (+34 -0)
---
license: apache-2.0
datasets:
- BelleGroup/generated_train_0.5M_CN
- JosephusCheung/GuanacoDataset
language:
- zh
tags:
- alpaca
- Chinese-Vicuna
- llama
---

This is a Chinese instruction-tuned LoRA checkpoint based on llama-7B, from [this repo's](https://github.com/Facico/Chinese-Vicuna) work.

You can use it like this:

```python
import torch
from transformers import LlamaForCausalLM
from peft import PeftModel

# Load the base llama-7B model in 8-bit
model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Apply the LoRA weights on top of the base model
model = PeftModel.from_pretrained(
    model,
    LORA_PATH,  # specific checkpoint path from "Chinese-Vicuna/Chinese-Vicuna-lora-7b-belle-and-guanaco"
    torch_dtype=torch.float16,
    device_map={'': 0},
)
```
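Once the LoRA weights are applied, inference typically starts from an instruction-style prompt. The sketch below is an assumption, not part of this checkpoint's card: `build_prompt` is a hypothetical helper that formats an instruction in the Alpaca template commonly used with such checkpoints, and the commented generation lines assume a `LlamaTokenizer` loaded from the same base model.

```python
# Hypothetical helper: format an instruction (and optional input) in the
# Alpaca-style template often used for Chinese-Vicuna-style checkpoints.
# This template is an assumption, not confirmed by this model card.
def build_prompt(instruction: str, input_text: str = "") -> str:
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:"
    )

prompt = build_prompt("用中文回答：什么是机器学习？")

# Generation would then look roughly like this (requires the model loaded above):
# from transformers import LlamaTokenizer
# tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# outputs = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The response text is whatever the model emits after the final `### Response:` marker.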