Pinkstack committed on
Commit
7824ecb
1 Parent(s): 002e1b0

Update README.md

Files changed (1)
  1. README.md +0 -52
README.md CHANGED
@@ -33,58 +33,6 @@ This PGAM is based on Meta Llama 3.1 8B which we've given extra roblox LuaU trai
  To use this model, you must use a service which supports the GGUF file format.
  Additionaly, it uses the llama-3.1 template.
 
- ```
- {{ if .Messages }}
- {{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
- {{- if .System }}
-
- {{ .System }}
- {{- end }}
- {{- if .Tools }}
-
- You are a helpful assistant with tool calling capabilities. When you receive a tool call response, use the output to format an answer to the original use question.
- {{- end }}
- {{- end }}<|eot_id|>
- {{- range $i, $_ := .Messages }}
- {{- $last := eq (len (slice $.Messages $i)) 1 }}
- {{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
- {{- if and $.Tools $last }}
-
- Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
-
- Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.
-
- {{ $.Tools }}
- {{- end }}
-
- {{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
-
- {{ end }}
- {{- else if eq .Role "assistant" }}<|start_header_id|>assistant<|end_header_id|>
- {{- if .ToolCalls }}
-
- {{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
- {{- else }}
-
- {{ .Content }}{{ if not $last }}<|eot_id|>{{ end }}
- {{- end }}
- {{- else if eq .Role "tool" }}<|start_header_id|>ipython<|end_header_id|>
-
- {{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
-
- {{ end }}
- {{- end }}
- {{- end }}
- {{- else }}
- {{- if .System }}<|start_header_id|>system<|end_header_id|>
-
- {{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
-
- {{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
-
- {{ end }}{{ .Response }}{{ if .Response }}<|eot_id|>{{ end }}
- ```
-
  Highly recommended to use with a system prompt.
 
  # Extra information
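
As a usage sketch, a Modelfile can point at a GGUF build of this model, apply the single-turn branch of the llama-3.1 template shown above, and set a system prompt. This assumes an Ollama-style runtime; the GGUF file name, model name, and system prompt wording below are placeholders, not files shipped in this repo.

```
# Minimal Modelfile sketch, assuming Ollama as the GGUF-capable runtime.
# The GGUF path and system prompt text are placeholders.
FROM ./pgam-llama-3.1-8b.gguf

# Single-turn branch of the llama-3.1 template (the fallback branch of the block above).
TEMPLATE """{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}{{ if .Response }}<|eot_id|>{{ end }}"""

# A system prompt is highly recommended (placeholder wording).
SYSTEM """You are a helpful assistant that writes Roblox LuaU code."""

PARAMETER stop <|eot_id|>
```

Build and run with `ollama create pgam -f Modelfile` followed by `ollama run pgam`.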
 