TemplateError: System role not supported
#15 opened by luogy
The question is the same as the title: does Gemma-1.1 not support {"role": "system", "content": system_prompt}?
     27 messages = [
     28     {
     29         "role": "system",
   (...)
     33     {"role": "user", "content": user_input},
     34 ]
---> 36 prompt = pipe.tokenizer.apply_chat_template(messages,
     37                                             tokenize=False,
     38                                             add_generation_prompt=True)
     41 outputs = pipe(prompt,
     42                max_new_tokens=256,
     43                do_sample=True,
     44                temperature=0.7,
     45                top_k=50,
     46                top_p=0.95)
     48 generated_outputs = outputs[0]["generated_text"]
...
File ~/mambaforge/envs/ollama/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:1828, in PreTrainedTokenizerBase._compile_jinja_template.<locals>.raise_exception(message)
   1827 def raise_exception(message):
-> 1828     raise TemplateError(message)

TemplateError: System role not supported
I am also having this issue
System instructions are not supported directly, but you can start the prompt with an instruction like "You are an expert in computers and ..." and the model will follow it.
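One way to apply that workaround programmatically is to fold a leading system message into the first user turn before calling apply_chat_template, since Gemma's template rejects the "system" role. Below is a minimal sketch; merge_system_into_first_user is a hypothetical helper, not part of transformers.

```python
def merge_system_into_first_user(messages):
    """Return a copy of `messages` with any leading "system" message
    prepended to the content of the first "user" message.

    This is a workaround sketch for chat templates (like Gemma's) that
    raise TemplateError on the "system" role.
    """
    if not messages or messages[0]["role"] != "system":
        return [dict(m) for m in messages]  # nothing to merge

    system_prompt = messages[0]["content"]
    merged = []
    first_user_seen = False
    for msg in messages[1:]:
        if msg["role"] == "user" and not first_user_seen:
            # Prepend the instruction to the first user turn.
            merged.append({
                "role": "user",
                "content": system_prompt + "\n\n" + msg["content"],
            })
            first_user_seen = True
        else:
            merged.append(dict(msg))
    return merged
```

The merged list can then be passed to pipe.tokenizer.apply_chat_template as in the snippet above, with no "system" entry left for the template to reject.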
Hello!
With that approach, the user can override the model's original purpose. Ideally, the user should not be able to modify the model's operating principles. How can that be achieved?
omg