---
license: other
license_name: qwen-research
license_link: https://huggingface.co/Qwen/Qwen2.5-3B/blob/main/LICENSE
language:
- fr
- en
pipeline_tag: text-generation
tags:
- chat
- qwen
- qwen2.5
- finetune
- french
- english
library_name: transformers
inference: false
model_creator: MaziyarPanahi
quantized_by: MaziyarPanahi
base_model: Qwen/Qwen2.5-3B
model_name: calme-3.2-baguette-3b
datasets:
- MaziyarPanahi/french_instruct_sharegpt
- MaziyarPanahi/calme-legalkit-v0.2
---

<img src="./calme_3.png" alt="Calme-3 Models" width="800" style="margin-left: auto; margin-right: auto; display: block;"/>

# MaziyarPanahi/calme-3.2-baguette-3b

This model is an advanced iteration of the powerful `Qwen/Qwen2.5-3B`, fine-tuned to enhance its general-purpose capabilities in French and English.

# ⚡ Quantized GGUF

All GGUF models are available here: [MaziyarPanahi/calme-3.2-baguette-3b-GGUF](https://huggingface.co/MaziyarPanahi/calme-3.2-baguette-3b-GGUF)

# 🏆 [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)

Leaderboard 2 coming soon!

# Prompt Template

This model uses the `ChatML` prompt template:

```
<|im_start|>system
{System}
<|im_end|>
<|im_start|>user
{User}
<|im_end|>
<|im_start|>assistant
{Assistant}
```
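As a quick illustration (not part of the original card), the layout above can be sketched with a small helper. The `to_chatml` name is ours; in practice, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` in `transformers` produces this formatting directly from the model's bundled chat template.

```python
# Minimal sketch of the ChatML layout shown above; the helper name
# `to_chatml` is illustrative, not an official API.
def to_chatml(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts into a ChatML prompt string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}\n<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Leave the prompt open so the model completes the assistant turn.
        parts.append("<|im_start|>assistant")
    return "\n".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Bonjour !"},
])
print(prompt)
```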
58
+
59
+ # How to use
60
+
61
+
62
+ ```python
63
+
64
+ # Use a pipeline as a high-level helper
65
+
66
+ from transformers import pipeline
67
+
68
+ messages = [
69
+ {"role": "user", "content": "Who are you?"},
70
+ ]
71
+ pipe = pipeline("text-generation", model="MaziyarPanahi/calme-3.2-baguette-3b")
72
+ pipe(messages)
73
+
74
+
75
+ # Load model directly
76
+
77
+ from transformers import AutoTokenizer, AutoModelForCausalLM
78
+
79
+ tokenizer = AutoTokenizer.from_pretrained("MaziyarPanahi/calme-3.2-baguette-3b")
80
+ model = AutoModelForCausalLM.from_pretrained("MaziyarPanahi/calme-3.2-baguette-3b")
81
+ ```

# Ethical Considerations

As with any large language model, users should be aware of potential biases and limitations. We recommend implementing appropriate safeguards and human oversight when deploying this model in production environments.