galatolo committed
Commit d4c8fd7
1 Parent(s): fbea1e5

Added Prompt Format section

Files changed (1)
  1. README.md +21 -0
README.md CHANGED
@@ -114,6 +114,27 @@ The model is trained on an expansive Italian Large Language Model (LLM) using sy
 
  The model has been trained for **1 epoch**, ensuring a convergence of knowledge and proficiency in handling diverse linguistic tasks.
 
+ ## Prompt Format
+
+ **cerbero-7b** is trained on full conversations using the following prompt format:
+
+ ```
+ [|Umano|] First human message
+ [|Assistente|] First AI reply
+ [|Umano|] Second human message
+ [|Assistente|] Second AI reply
+ ```
+
+ When crafting prompts, make sure to conclude with the `[|Assistente|]` tag, which signals the AI to generate a response.
+ Use `[|Umano|]` as the stop word. For example:
+
+ ```
+ [|Umano|] Come posso distinguere un AI da un umano?
+ [|Assistente|]
+ ```
+
+ While it is possible to include a brief system message at the start of your prompt, keep in mind that the training data for **cerbero-7b** **does not** contain such **system messages**. For optimal model performance, it is therefore recommended to minimize or avoid them.
+
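As an illustrative sketch (not part of the README diff above), a conversation could be serialized into this format as follows; the `build_prompt` helper and its exact whitespace handling are assumptions based on the examples shown, not an official API:

```python
# Hypothetical helper (not from the cerbero-7b repo): serialize a chat into the
# [|Umano|]/[|Assistente|] prompt format shown above.
def build_prompt(turns):
    """turns: list of (role, text) pairs, where role is "user" or "assistant"."""
    tags = {"user": "[|Umano|]", "assistant": "[|Assistente|]"}
    lines = [f"{tags[role]} {text}" for role, text in turns]
    # Always end with the assistant tag so the model knows to reply next.
    lines.append("[|Assistente|]")
    return "\n".join(lines)

prompt = build_prompt([("user", "Come posso distinguere un AI da un umano?")])
print(prompt)
# [|Umano|] Come posso distinguere un AI da un umano?
# [|Assistente|]
```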
  ## Getting Started 🚀
 
  You can load **cerbero-7b** (or **cerbero-7b-openchat**) using [🤗transformers](https://huggingface.co/docs/transformers/index)
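
A hedged sketch of loading the model with 🤗transformers and truncating generation at the `[|Umano|]` stop word; the repository id `galatolo/cerbero-7b`, the dtype, and the generation settings are assumptions rather than values taken from this commit:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "galatolo/cerbero-7b"  # assumed repo id; check the model card for the exact name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)

prompt = "[|Umano|] Come posso distinguere un AI da un umano?\n[|Assistente|]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
    )

# Decode only the newly generated tokens and cut at the stop word,
# since generation may run on into a new [|Umano|] turn.
generated = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
reply = generated.split("[|Umano|]")[0].strip()
print(reply)
```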