MikeRoz committed on
Commit 3c750f6 · verified · 1 Parent(s): ea8d5de

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,129 @@
+ ---
+ license: llama3.1
+ ---
+
+ # Llama-3.1-70B-ArliAI-RPMax-v1.3
+
+ =====================================
+
+ ## RPMax Series Overview
+
+ v1.1 = [2B](https://huggingface.co/ArliAI/Gemma-2-2B-ArliAI-RPMax-v1.1) | [3.8B](https://huggingface.co/ArliAI/Phi-3.5-mini-3.8B-ArliAI-RPMax-v1.1) | [8B](https://huggingface.co/ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1) | [9B](https://huggingface.co/ArliAI/Gemma-2-9B-ArliAI-RPMax-v1.1) | [12B](https://huggingface.co/ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.1) | [20B](https://huggingface.co/ArliAI/InternLM2_5-20B-ArliAI-RPMax-v1.1) | [22B](https://huggingface.co/ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1) | [70B](https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.1)
+
+ v1.2 = [8B](https://huggingface.co/ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.2) | [12B](https://huggingface.co/ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2) | [70B](https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.2)
+
+ v1.3 = [8B](https://huggingface.co/ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.3) | [32B](https://huggingface.co/ArliAI/Qwen2.5-32B-ArliAI-RPMax-v1.3) | [70B](https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3)
+
+ RPMax is a series of models trained on a diverse set of curated creative writing and RP datasets with a focus on variety and deduplication. The models are designed to be highly creative and non-repetitive by making sure no two entries in the dataset repeat the same characters or situations, which keeps a model from latching onto a certain personality and lets it understand and act appropriately for any character or situation.
+
+ Many RPMax users have mentioned that these models do not feel like any other RP models: they have a different writing style and generally do not feel inbred.
+
+ You can access the model at https://arliai.com, and we also have a models ranking page at https://www.arliai.com/models-ranking
+
+ Ask questions in our new Discord server https://discord.com/invite/t75KbPgwhk or on our subreddit https://www.reddit.com/r/ArliAI/
+
+ ## Model Description
+
+ Llama-3.1-70B-ArliAI-RPMax-v1.3 is a variant made from the Llama-3.1-70B-Instruct model.
+
+ Let us know what you think of the model! The different parameter versions are based on different base models, so they might each behave slightly differently in their own way.
+
+ The v1.3 models are trained with updated software and configs, such as a newer transformers library that fixes the gradient checkpointing bug, which should help the model learn better.
+ This version also uses rsLoRA+ for training, which helps the model learn even better.
+
+ ### Specs
+
+ * **Context Length**: 128K
+ * **Parameters**: 70B
+
+ ### Training Details
+
+ * **Sequence Length**: 4096
+ * **Training Duration**: Approximately 5 days on 2x3090Ti
+ * **Epochs**: 1 epoch training for minimized repetition sickness
+ * **RS-QLORA+**: 64-rank 64-alpha, resulting in ~2% trainable weights
+ * **Learning Rate**: 0.00001
+ * **Gradient accumulation**: A very low 32, for better learning (see the sketch below)
+
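+ As a rough illustration of the hyperparameters above, here is a minimal sketch of an RS-QLoRA-style setup using `transformers`, `peft`, and `bitsandbytes`. This is not the exact training script used; the base model id is assumed, and the LoRA+ part (a higher learning rate for the LoRA B matrices) is left to the trainer.
+
+ ```python
+ # Hypothetical RS-QLoRA setup mirroring the listed hyperparameters.
+ import torch
+ from transformers import AutoModelForCausalLM, BitsAndBytesConfig
+ from peft import LoraConfig, get_peft_model
+
+ bnb = BitsAndBytesConfig(                # QLoRA: 4-bit quantized base weights
+     load_in_4bit=True,
+     bnb_4bit_quant_type="nf4",
+     bnb_4bit_compute_dtype=torch.bfloat16,
+ )
+ model = AutoModelForCausalLM.from_pretrained(
+     "meta-llama/Meta-Llama-3.1-70B-Instruct",   # assumed base model id
+     quantization_config=bnb,
+ )
+ lora = LoraConfig(
+     r=64, lora_alpha=64,                 # 64-rank, 64-alpha as listed above
+     use_rslora=True,                     # rank-stabilized LoRA scaling
+     target_modules="all-linear",
+     task_type="CAUSAL_LM",
+ )
+ model = get_peft_model(model, lora)      # roughly ~2% of weights trainable
+ model.print_trainable_parameters()
+ # Train for 1 epoch with lr=1e-5 and gradient_accumulation_steps=32.
+ ```
+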
+ ## Quantization
+
+ The model is available in quantized formats (a download sketch follows the list):
+
+ * **FP16**: https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3
+ * **GGUF**: https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3-GGUF
+ * **Bartowski's GGUF**: https://huggingface.co/bartowski/Llama-3.1-70B-ArliAI-RPMax-v1.3-GGUF
+
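+ For example, a single quantization level can be fetched from the GGUF repos with `huggingface_hub`; the filename pattern below is an assumption, so check the repo's file list first.
+
+ ```python
+ # Sketch: download one GGUF quant level instead of the whole repo.
+ from huggingface_hub import snapshot_download
+
+ path = snapshot_download(
+     repo_id="ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3-GGUF",
+     allow_patterns=["*Q4_K_M*"],         # hypothetical quant level
+ )
+ print(path)
+ ```
+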
+ ## Suggested Prompt Format
+
+ Llama 3 Instruct Format
+
+ Example:
+ ```
+ <|begin_of_text|><|start_header_id|>system<|end_header_id|>
+
+ You are [character]. You have a personality of [personality description]. [Describe scenario]<|eot_id|><|start_header_id|>user<|end_header_id|>
+
+ {{ user_message_1 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
+
+ {{ model_answer_1 }}<|eot_id|><|start_header_id|>user<|end_header_id|>
+
+ {{ user_message_2 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
+ ```
+
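+ Since this is the standard Llama 3 Instruct format, the tokenizer's bundled chat template should reproduce it; a minimal sketch, assuming the repo ships the usual Llama 3.1 template:
+
+ ```python
+ # Sketch: build the prompt layout above from the tokenizer's chat template.
+ from transformers import AutoTokenizer
+
+ tok = AutoTokenizer.from_pretrained("ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3")
+ messages = [
+     {"role": "system", "content": "You are [character]. You have a personality of [personality description]. [Describe scenario]"},
+     {"role": "user", "content": "Hi there!"},
+ ]
+ prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+ print(prompt)   # should match the header/eot layout shown above
+ ```
+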
+ # RPMax: Reduced repetition and higher creativity model
+
+ The goal of RPMax is to reduce repetition and increase the model's ability to write creatively in the different situations presented to it. In practice, this means a model that produces very different responses across situations without falling into predictable tropes.
+
+ # What is repetition and creativity?
+
+ First of all, creativity here should mean the variety of output the model is capable of creating, and it should not be confused with writing prose. When a model writes pleasantly, the way a novelist would, that is not in itself creative writing; it is just a certain pleasant style of prose. A model that writes nicely is therefore not necessarily a creative model.
+
+ Repetition and creativity are essentially intertwined: if a model is repetitive, it can also be said to be uncreative, as it cannot write new things and can only repeat responses similar to those it has created before. There are actually two very different forms of repetition.
+
+ **In-context repetition:** When people say a model is repetitive, they usually mean a model that likes to repeat the same phrases within a single conversation. An example of this is when a model says that a character "flicks her hair and..." and then starts working "flicks her hair and..." into every other action that character does.
+
+ It can be said that the model is boring, but even in real people's writing this kind of repetition can be intentional, to subtly prove a point or showcase a character's traits in some scenarios. So this type of repetition is not always bad, and completely discouraging a model from doing it does not always improve a model's writing ability.
+
+ **Cross-context repetition:** A second, arguably worse, type of repetition is a model's tendency to repeat the same phrases or tropes in very different situations. An example is a model that likes to repeat the infamous "shivers down my spine" phrase in wildly different conversations that don't necessarily fit that phrase.
+
+ This type of repetition is ALWAYS bad, as it is a sign that the model has over-fitted on the style of "creative writing" it saw most often in its training dataset. A model's tendency toward cross-context repetition is also usually visible in how it picks similar repetitive names when writing stories, such as the infamous "elara" and "whispering woods".
+
+ With RPMax v1 the main goal is to create a highly creative model by reducing cross-context repetition, as that is the type of repetition that follows you through different conversations. This type of repetition can also be combated by making sure the dataset does not repeat the same situations or characters across different example entries. A rough way to measure both kinds of repetition is sketched below.
+
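+ As a toy illustration of that measurement, repeated n-grams within one reply indicate in-context repetition, while n-grams shared across replies to unrelated prompts indicate cross-context repetition:
+
+ ```python
+ # Toy metric: in-context vs cross-context repetition via n-gram overlap.
+ from collections import Counter
+
+ def ngrams(text, n=4):
+     words = text.lower().split()
+     return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
+
+ def in_context_repetition(reply):
+     counts = Counter(ngrams(reply))            # repeats inside a single reply
+     return sum(c - 1 for c in counts.values())
+
+ def cross_context_repetition(replies):
+     seen = [set(ngrams(r)) for r in replies]   # one set per unrelated chat
+     return set.intersection(*seen)             # phrases that follow the model around
+
+ replies = ["a shiver ran down her spine as she spoke",
+            "a shiver ran down her spine as the door opened"]
+ print(cross_context_repetition(replies))
+ ```
+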
+ # Dataset Curation
+
+ RPMax is successful thanks to the training method and the training dataset created for these models' fine-tuning. It contains as many open-source creative writing and RP datasets as could be found (mostly from Hugging Face), curated to weed out datasets that are purely synthetic generations, as those often only serve to dumb down the model and teach it GPT-isms (slop) rather than help.
+
+ Then Llama 3.1 8B is used to create a database of the characters and situations portrayed in these datasets, which is in turn used to de-dupe the datasets and make sure there is only a single entry for any character or situation.
+
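+ A simplified sketch of that de-dupe step is shown below; the tag extractor is hypothetical and stands in for an LLM call, not the exact pipeline used.
+
+ ```python
+ # Sketch: keep only one dataset entry per (character, situation) pair.
+ def extract_tags(entry):
+     """Hypothetical call to a small model (e.g. Llama 3.1 8B) that names
+     the character and situation portrayed in this entry."""
+     ...
+
+ def dedupe(dataset):
+     seen, kept = set(), []
+     for entry in dataset:
+         tags = extract_tags(entry)
+         key = (tags["character"], tags["situation"])
+         if key not in seen:                    # single entry per character/situation
+             seen.add(key)
+             kept.append(entry)
+     return kept
+ ```
+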
+ # The Golden Rule of Fine-Tuning
+
+ Unlike the initial pre-training stage, where for the most part the more data you throw at the model the better it becomes, the golden rule for fine-tuning is quality over quantity. The dataset for RPMax is therefore orders of magnitude smaller than it would be if it included repeated characters and situations, but the end result is a model that does not feel like just another remix of other creative writing/RP models.
+
+ # Training Parameters
+
+ RPMax's training parameters also take a different approach from other fine-tunes. The usual way is a low learning rate with high gradient accumulation for better loss stability, running multiple epochs until the loss is acceptable.
+
+ # RPMax's Unconventional Approach
+
+ RPMax, on the other hand, is trained for only a single epoch, with low gradient accumulation and a higher-than-normal learning rate. The loss curve during training is unstable and jumps up and down a lot, but if you smooth it out it is still steadily decreasing over time, although it never ends up at a very low loss value. The theory is that this lets the model learn much more from each individual example in the dataset, and that never showing the model the same example twice stops it from latching onto and reinforcing a single character or story trope.
+
+ The loss jumps up and down during training because each new entry the model is trained on is unlike anything it has seen before, so it cannot really predict an answer similar to that example. Meanwhile, the relatively high final loss of 1.0 or slightly above is actually good for RPMax models, because the goal was never to create a model that outputs exactly like its training dataset, but rather one creative enough to make up its own style of responses.
+
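+ The "smooth it out" observation is easy to reproduce on your own training logs; a minimal sketch using exponential smoothing over the recorded step losses:
+
+ ```python
+ # Sketch: exponentially smooth a noisy loss curve to reveal the trend.
+ def ema(losses, beta=0.98):
+     smoothed, avg = [], losses[0]
+     for loss in losses:
+         avg = beta * avg + (1 - beta) * loss   # running average of the jumpy loss
+         smoothed.append(avg)
+     return smoothed
+
+ # With single-epoch training every step is an unseen example, so the raw
+ # losses jump around; the smoothed series should still drift down toward ~1.0.
+ ```
+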
+ This is different from training a model on a particular domain, where the model needs to reliably output like the example dataset, such as when training on a company's internal knowledge base.
+
+ # Difference between versions?
+
+ v1.0 had some mistakes in the training parameters, which is why not many versions of it were created.
+
+ v1.1 fixed the previous errors and is the version where many different base models were used in order to compare and figure out which models are most ideal for RPMax. The consensus is that Mistral-based models are fantastic for RPMax, as they are by far the most uncensored by default. Gemma also has quite an interesting writing style, but it had a lot of issues with running and training, along with generally low interest in it. Llama 3.1-based models also do well, with the 70B having the lowest loss at the end of its training runs.
+
+ v1.2 was a fix of the dataset: many entries were found to contain broken or otherwise nonsensical system prompts or messages in the example conversations. Training on the v1.2 dataset predictably made the models better at following instructions and staying coherent.
+
+ v1.3 was not originally planned, but with the gradient checkpointing bug being found recently and training frameworks finally getting updated with the fix, it seemed like a good excuse to run a v1.3 of RPMax. This version focuses on improving the training parameters: training was done using rsLoRA+, or rank-stabilized low-rank adaptation with the addition of LoRA+. These additions improved the models' learning quite considerably, with all the models achieving lower loss than the previous iteration and producing better-quality outputs in real usage.
+
+ # Real Success?
+
+ RPMax models have been out for a few months at this point, with versions from v1.0 all the way to the new v1.3. So far RPMax seems to have been a resounding success, in that it achieves its original goal of being a new creative writing/RP model that does not write like other RP finetunes. Many of its users mention that it almost feels like interacting with a real person in an RP scenario, and that it does impressively unexpected things in their stories that catch them off guard in a good way.
+
+ Is it the best model there is? Probably not, but there isn't ever one single best model. So try it out for yourself, and maybe you will like it! As always, any feedback on the model is appreciated and will be taken into account for the next versions.
config.json ADDED
@@ -0,0 +1,47 @@
+ {
+ "_name_or_path": "/home/owen/models/Meta-Llama-3.1-70B-Instruct",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "bos_token_id": 128000,
+ "eos_token_id": 128009,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 8192,
+ "initializer_range": 0.02,
+ "intermediate_size": 28672,
+ "max_position_embeddings": 131072,
+ "mlp_bias": false,
+ "model_type": "llama",
+ "num_attention_heads": 64,
+ "num_hidden_layers": 80,
+ "num_key_value_heads": 8,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": {
+ "factor": 8.0,
+ "high_freq_factor": 4.0,
+ "low_freq_factor": 1.0,
+ "original_max_position_embeddings": 8192,
+ "rope_type": "llama3"
+ },
+ "rope_theta": 500000.0,
+ "tie_word_embeddings": false,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.46.1",
+ "use_cache": false,
+ "vocab_size": 128256,
+ "quantization_config": {
+ "quant_method": "exl2",
+ "version": "0.2.3",
+ "bits": 5.0,
+ "head_bits": 6,
+ "calibration": {
+ "rows": 115,
+ "length": 2048,
+ "dataset": "(default)"
+ }
+ }
+ }
generation_config.json ADDED
@@ -0,0 +1,12 @@
+ {
+ "bos_token_id": 128000,
+ "do_sample": true,
+ "eos_token_id": [
+ 128001,
+ 128008,
+ 128009
+ ],
+ "temperature": 0.6,
+ "top_p": 0.9,
+ "transformers_version": "4.46.1"
+ }
huggingface-metadata.txt ADDED
@@ -0,0 +1,34 @@
+ url: https://huggingface.co/ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3
+ branch: main
+ download date: 2024-11-22 05:01:27
+ sha256sum:
+ 5c6f244050ab65a6f3db67354918929fde311957010284ea219d1f45f8088154 model-00001-of-00030.safetensors
+ f4ae35138e09a0c8c45bd1b483941ab79432de821b5577911f1ed3d746d08d9e model-00002-of-00030.safetensors
+ 458d355bdff80261ab0b8f84ed80c335bc2bed600c490cc865fb93fe271e728e model-00003-of-00030.safetensors
+ 89837fb34a58bb920e584d8dd7cea2bf68cd3d71dd70f9aee4132e2e14d2f185 model-00004-of-00030.safetensors
+ 42326b22f6a1c7fedcdf01bb818f72cacfd599389429d1a796c8ca76398f6e03 model-00005-of-00030.safetensors
+ 6a835bec4a0f6d7abe46ec9a59c7bfac2011abc96957628876b7a847007a7d11 model-00006-of-00030.safetensors
+ df67c4bd6cd6fe72549a4e0d94067d455712b6c70f53590606cf296db2aadf11 model-00007-of-00030.safetensors
+ f4144feefe05288cdc28379262f0b0f60768258bf97157eb08dab5f282a410c7 model-00008-of-00030.safetensors
+ a5bfd7cea01e173ed03b190684b3bc1d536b727200e196a2550714fbe6bb8936 model-00009-of-00030.safetensors
+ 078af1617aca646b570c44de2687ba3941fd871f748e1e14a8a9157f65a6bc1e model-00010-of-00030.safetensors
+ 4e52fbd61bf189849d2ab94c2e8851cb1f3c4e24c23c7bdc4f8eb79705ede864 model-00011-of-00030.safetensors
+ a44a29db6325e1f27a3ad5f565c5617edb5d1ec20309fd59a1a272d9e0c32a20 model-00012-of-00030.safetensors
+ a6a01544fbef03ae25ca08110f55a79d62c749d111e1ef2a49c6649afb9abc06 model-00013-of-00030.safetensors
+ 98c54703728d7b5cd88bcedc5a73f8e54518cd0d7fe54a0272559c27e1b8e994 model-00014-of-00030.safetensors
+ 10583ee7d83a9e87440b58e2ab63ed1ef88f935bb8c44a9a3e5a26a9a3dde967 model-00015-of-00030.safetensors
+ 1b32b71b718e3a58943309c9397ae25180ab6ef8015fbf6795e6f7352ddbee76 model-00016-of-00030.safetensors
+ 118c7b2881ade7b66551ac1fc50478839949372894021b51034fa7d798c9a0d2 model-00017-of-00030.safetensors
+ adb8caf5db5b689339c28ec53da39cb1b36833d27fd1e6ae90a2739f2f6b6dc5 model-00018-of-00030.safetensors
+ 159694a32e94c5892c72746e022f42416620a213097b10dd34f56d51761ac8c1 model-00019-of-00030.safetensors
+ 267cc228b49655f7d27602678c173dccb1914825c61fd6786fbfc340d2520f09 model-00020-of-00030.safetensors
+ 3ae9d507dbf02ee64215a31865ab7c6cae06bb8ef46ea47c41acf4feeeab5a76 model-00021-of-00030.safetensors
+ f5c39f5bb9a333a4499386a9a8d67108a14b01b353921e6d76c892c8404ada65 model-00022-of-00030.safetensors
+ ebf78ebbf1365f7df5d47fa75ec86d710b84f9462216db785e64f5b53cd6d7e5 model-00023-of-00030.safetensors
+ 94ca9559eb0d73a2a646f69ff20c4f8e7e44faf0356d2748c80b11c292fc6916 model-00024-of-00030.safetensors
+ b04fdbef901d0202bcb4230b24846f7b4fdb2ed473758f181bcf0ee95633fdf0 model-00025-of-00030.safetensors
+ e05f4944a310de9ac95a0954080ae3dd0672024803daf70472603e10d42821a9 model-00026-of-00030.safetensors
+ 4746e3b5d0e8fe2c1039e4df0bb9f0f9fe14e3753528e9150565a95f14136c11 model-00027-of-00030.safetensors
+ f2a7e7facca2d495b9b0cf8557dfe00c8f871564a3284ab7a5b043d57ac43b9a model-00028-of-00030.safetensors
+ e034783e81bf6a362e26da4ba3491ebebe383fcaafaf139f36d50441355e9fea model-00029-of-00030.safetensors
+ c845d7ccba30e2e0eb8ab20aa313d5ed45dca6634e214a62ed323b70462fa828 model-00030-of-00030.safetensors
measurement.json ADDED
The diff for this file is too large to render.
 
model.safetensors.index.json ADDED
@@ -0,0 +1,730 @@
+ {
+ "metadata": {
+ "total_size": 141107412992
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00030-of-00030.safetensors",
+ "model.embed_tokens.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00030.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00005-of-00030.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00006-of-00030.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00007-of-00030.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00008-of-00030.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00009-of-00030.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00010-of-00030.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00011-of-00030.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.32.self_attn.k_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.o_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.q_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.32.self_attn.v_proj.weight": "model-00012-of-00030.safetensors",
+ "model.layers.33.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.o_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.33.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.input_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.down_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.gate_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.mlp.up_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.post_attention_layernorm.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.o_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.34.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.self_attn.k_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.35.self_attn.q_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.35.self_attn.v_proj.weight": "model-00013-of-00030.safetensors",
+ "model.layers.36.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.k_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.q_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.36.self_attn.v_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.input_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.down_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.gate_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.mlp.up_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.post_attention_layernorm.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.k_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.o_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.q_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.37.self_attn.v_proj.weight": "model-00014-of-00030.safetensors",
+ "model.layers.38.input_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.down_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.post_attention_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.38.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.input_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.down_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.post_attention_layernorm.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.39.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00030.safetensors",
+ "model.layers.40.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.mlp.gate_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.mlp.up_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.40.self_attn.k_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.o_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.q_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.40.self_attn.v_proj.weight": "model-00015-of-00030.safetensors",
+ "model.layers.41.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.mlp.up_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.41.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.input_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.down_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.mlp.up_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.post_attention_layernorm.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.42.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.mlp.gate_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.43.self_attn.k_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.o_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.q_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.43.self_attn.v_proj.weight": "model-00016-of-00030.safetensors",
+ "model.layers.44.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.gate_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.44.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.input_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.down_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.gate_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.mlp.up_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.post_attention_layernorm.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.45.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.46.self_attn.k_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.o_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.q_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.46.self_attn.v_proj.weight": "model-00017-of-00030.safetensors",
+ "model.layers.47.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.o_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.47.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.input_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.down_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.gate_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.mlp.up_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.post_attention_layernorm.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.o_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.48.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.self_attn.k_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.49.self_attn.q_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.49.self_attn.v_proj.weight": "model-00018-of-00030.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
+ "model.layers.50.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.k_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.q_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.50.self_attn.v_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.input_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.down_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.gate_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.mlp.up_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.post_attention_layernorm.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.k_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.o_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.q_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.51.self_attn.v_proj.weight": "model-00019-of-00030.safetensors",
+ "model.layers.52.input_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.down_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.post_attention_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.52.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.input_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.mlp.down_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.post_attention_layernorm.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.53.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.input_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.54.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.54.mlp.gate_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.mlp.up_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.54.self_attn.k_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.self_attn.o_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.self_attn.q_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.54.self_attn.v_proj.weight": "model-00020-of-00030.safetensors",
+ "model.layers.55.input_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.mlp.up_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.55.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.input_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.mlp.down_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.mlp.up_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.post_attention_layernorm.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.56.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.57.input_layernorm.weight": "model-00022-of-00030.safetensors",
+ "model.layers.57.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
+ "model.layers.57.mlp.gate_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.57.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
+ "model.layers.57.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
+ "model.layers.57.self_attn.k_proj.weight": "model-00021-of-00030.safetensors",
+ "model.layers.57.self_attn.o_proj.weight": "model-00021-of-00030.safetensors",
492
+ "model.layers.57.self_attn.q_proj.weight": "model-00021-of-00030.safetensors",
493
+ "model.layers.57.self_attn.v_proj.weight": "model-00021-of-00030.safetensors",
494
+ "model.layers.58.input_layernorm.weight": "model-00022-of-00030.safetensors",
495
+ "model.layers.58.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
496
+ "model.layers.58.mlp.gate_proj.weight": "model-00022-of-00030.safetensors",
497
+ "model.layers.58.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
498
+ "model.layers.58.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
499
+ "model.layers.58.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
500
+ "model.layers.58.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
501
+ "model.layers.58.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
502
+ "model.layers.58.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
503
+ "model.layers.59.input_layernorm.weight": "model-00022-of-00030.safetensors",
504
+ "model.layers.59.mlp.down_proj.weight": "model-00022-of-00030.safetensors",
505
+ "model.layers.59.mlp.gate_proj.weight": "model-00022-of-00030.safetensors",
506
+ "model.layers.59.mlp.up_proj.weight": "model-00022-of-00030.safetensors",
507
+ "model.layers.59.post_attention_layernorm.weight": "model-00022-of-00030.safetensors",
508
+ "model.layers.59.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
509
+ "model.layers.59.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
510
+ "model.layers.59.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
511
+ "model.layers.59.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
512
+ "model.layers.6.input_layernorm.weight": "model-00003-of-00030.safetensors",
513
+ "model.layers.6.mlp.down_proj.weight": "model-00003-of-00030.safetensors",
514
+ "model.layers.6.mlp.gate_proj.weight": "model-00003-of-00030.safetensors",
515
+ "model.layers.6.mlp.up_proj.weight": "model-00003-of-00030.safetensors",
516
+ "model.layers.6.post_attention_layernorm.weight": "model-00003-of-00030.safetensors",
517
+ "model.layers.6.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
518
+ "model.layers.6.self_attn.o_proj.weight": "model-00003-of-00030.safetensors",
519
+ "model.layers.6.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
520
+ "model.layers.6.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
521
+ "model.layers.60.input_layernorm.weight": "model-00023-of-00030.safetensors",
522
+ "model.layers.60.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
523
+ "model.layers.60.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
524
+ "model.layers.60.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
525
+ "model.layers.60.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
526
+ "model.layers.60.self_attn.k_proj.weight": "model-00022-of-00030.safetensors",
527
+ "model.layers.60.self_attn.o_proj.weight": "model-00022-of-00030.safetensors",
528
+ "model.layers.60.self_attn.q_proj.weight": "model-00022-of-00030.safetensors",
529
+ "model.layers.60.self_attn.v_proj.weight": "model-00022-of-00030.safetensors",
530
+ "model.layers.61.input_layernorm.weight": "model-00023-of-00030.safetensors",
531
+ "model.layers.61.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
532
+ "model.layers.61.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
533
+ "model.layers.61.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
534
+ "model.layers.61.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
535
+ "model.layers.61.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
536
+ "model.layers.61.self_attn.o_proj.weight": "model-00023-of-00030.safetensors",
537
+ "model.layers.61.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
538
+ "model.layers.61.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
539
+ "model.layers.62.input_layernorm.weight": "model-00023-of-00030.safetensors",
540
+ "model.layers.62.mlp.down_proj.weight": "model-00023-of-00030.safetensors",
541
+ "model.layers.62.mlp.gate_proj.weight": "model-00023-of-00030.safetensors",
542
+ "model.layers.62.mlp.up_proj.weight": "model-00023-of-00030.safetensors",
543
+ "model.layers.62.post_attention_layernorm.weight": "model-00023-of-00030.safetensors",
544
+ "model.layers.62.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
545
+ "model.layers.62.self_attn.o_proj.weight": "model-00023-of-00030.safetensors",
546
+ "model.layers.62.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
547
+ "model.layers.62.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
548
+ "model.layers.63.input_layernorm.weight": "model-00024-of-00030.safetensors",
549
+ "model.layers.63.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
550
+ "model.layers.63.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
551
+ "model.layers.63.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
552
+ "model.layers.63.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
553
+ "model.layers.63.self_attn.k_proj.weight": "model-00023-of-00030.safetensors",
554
+ "model.layers.63.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
555
+ "model.layers.63.self_attn.q_proj.weight": "model-00023-of-00030.safetensors",
556
+ "model.layers.63.self_attn.v_proj.weight": "model-00023-of-00030.safetensors",
557
+ "model.layers.64.input_layernorm.weight": "model-00024-of-00030.safetensors",
558
+ "model.layers.64.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
559
+ "model.layers.64.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
560
+ "model.layers.64.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
561
+ "model.layers.64.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
562
+ "model.layers.64.self_attn.k_proj.weight": "model-00024-of-00030.safetensors",
563
+ "model.layers.64.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
564
+ "model.layers.64.self_attn.q_proj.weight": "model-00024-of-00030.safetensors",
565
+ "model.layers.64.self_attn.v_proj.weight": "model-00024-of-00030.safetensors",
566
+ "model.layers.65.input_layernorm.weight": "model-00024-of-00030.safetensors",
567
+ "model.layers.65.mlp.down_proj.weight": "model-00024-of-00030.safetensors",
568
+ "model.layers.65.mlp.gate_proj.weight": "model-00024-of-00030.safetensors",
569
+ "model.layers.65.mlp.up_proj.weight": "model-00024-of-00030.safetensors",
570
+ "model.layers.65.post_attention_layernorm.weight": "model-00024-of-00030.safetensors",
571
+ "model.layers.65.self_attn.k_proj.weight": "model-00024-of-00030.safetensors",
572
+ "model.layers.65.self_attn.o_proj.weight": "model-00024-of-00030.safetensors",
573
+ "model.layers.65.self_attn.q_proj.weight": "model-00024-of-00030.safetensors",
574
+ "model.layers.65.self_attn.v_proj.weight": "model-00024-of-00030.safetensors",
575
+ "model.layers.66.input_layernorm.weight": "model-00025-of-00030.safetensors",
576
+ "model.layers.66.mlp.down_proj.weight": "model-00025-of-00030.safetensors",
577
+ "model.layers.66.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
578
+ "model.layers.66.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
579
+ "model.layers.66.post_attention_layernorm.weight": "model-00025-of-00030.safetensors",
580
+ "model.layers.66.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
581
+ "model.layers.66.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
582
+ "model.layers.66.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
583
+ "model.layers.66.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
584
+ "model.layers.67.input_layernorm.weight": "model-00025-of-00030.safetensors",
585
+ "model.layers.67.mlp.down_proj.weight": "model-00025-of-00030.safetensors",
586
+ "model.layers.67.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
587
+ "model.layers.67.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
588
+ "model.layers.67.post_attention_layernorm.weight": "model-00025-of-00030.safetensors",
589
+ "model.layers.67.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
590
+ "model.layers.67.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
591
+ "model.layers.67.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
592
+ "model.layers.67.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
593
+ "model.layers.68.input_layernorm.weight": "model-00026-of-00030.safetensors",
594
+ "model.layers.68.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
595
+ "model.layers.68.mlp.gate_proj.weight": "model-00025-of-00030.safetensors",
596
+ "model.layers.68.mlp.up_proj.weight": "model-00025-of-00030.safetensors",
597
+ "model.layers.68.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
598
+ "model.layers.68.self_attn.k_proj.weight": "model-00025-of-00030.safetensors",
599
+ "model.layers.68.self_attn.o_proj.weight": "model-00025-of-00030.safetensors",
600
+ "model.layers.68.self_attn.q_proj.weight": "model-00025-of-00030.safetensors",
601
+ "model.layers.68.self_attn.v_proj.weight": "model-00025-of-00030.safetensors",
602
+ "model.layers.69.input_layernorm.weight": "model-00026-of-00030.safetensors",
603
+ "model.layers.69.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
604
+ "model.layers.69.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
605
+ "model.layers.69.mlp.up_proj.weight": "model-00026-of-00030.safetensors",
606
+ "model.layers.69.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
607
+ "model.layers.69.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
608
+ "model.layers.69.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
609
+ "model.layers.69.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
610
+ "model.layers.69.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
611
+ "model.layers.7.input_layernorm.weight": "model-00004-of-00030.safetensors",
612
+ "model.layers.7.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
613
+ "model.layers.7.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
614
+ "model.layers.7.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
615
+ "model.layers.7.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
616
+ "model.layers.7.self_attn.k_proj.weight": "model-00003-of-00030.safetensors",
617
+ "model.layers.7.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
618
+ "model.layers.7.self_attn.q_proj.weight": "model-00003-of-00030.safetensors",
619
+ "model.layers.7.self_attn.v_proj.weight": "model-00003-of-00030.safetensors",
620
+ "model.layers.70.input_layernorm.weight": "model-00026-of-00030.safetensors",
621
+ "model.layers.70.mlp.down_proj.weight": "model-00026-of-00030.safetensors",
622
+ "model.layers.70.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
623
+ "model.layers.70.mlp.up_proj.weight": "model-00026-of-00030.safetensors",
624
+ "model.layers.70.post_attention_layernorm.weight": "model-00026-of-00030.safetensors",
625
+ "model.layers.70.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
626
+ "model.layers.70.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
627
+ "model.layers.70.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
628
+ "model.layers.70.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
629
+ "model.layers.71.input_layernorm.weight": "model-00027-of-00030.safetensors",
630
+ "model.layers.71.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
631
+ "model.layers.71.mlp.gate_proj.weight": "model-00026-of-00030.safetensors",
632
+ "model.layers.71.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
633
+ "model.layers.71.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
634
+ "model.layers.71.self_attn.k_proj.weight": "model-00026-of-00030.safetensors",
635
+ "model.layers.71.self_attn.o_proj.weight": "model-00026-of-00030.safetensors",
636
+ "model.layers.71.self_attn.q_proj.weight": "model-00026-of-00030.safetensors",
637
+ "model.layers.71.self_attn.v_proj.weight": "model-00026-of-00030.safetensors",
638
+ "model.layers.72.input_layernorm.weight": "model-00027-of-00030.safetensors",
639
+ "model.layers.72.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
640
+ "model.layers.72.mlp.gate_proj.weight": "model-00027-of-00030.safetensors",
641
+ "model.layers.72.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
642
+ "model.layers.72.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
643
+ "model.layers.72.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
644
+ "model.layers.72.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
645
+ "model.layers.72.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
646
+ "model.layers.72.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
647
+ "model.layers.73.input_layernorm.weight": "model-00027-of-00030.safetensors",
648
+ "model.layers.73.mlp.down_proj.weight": "model-00027-of-00030.safetensors",
649
+ "model.layers.73.mlp.gate_proj.weight": "model-00027-of-00030.safetensors",
650
+ "model.layers.73.mlp.up_proj.weight": "model-00027-of-00030.safetensors",
651
+ "model.layers.73.post_attention_layernorm.weight": "model-00027-of-00030.safetensors",
652
+ "model.layers.73.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
653
+ "model.layers.73.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
654
+ "model.layers.73.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
655
+ "model.layers.73.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
656
+ "model.layers.74.input_layernorm.weight": "model-00028-of-00030.safetensors",
657
+ "model.layers.74.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
658
+ "model.layers.74.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
659
+ "model.layers.74.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
660
+ "model.layers.74.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
661
+ "model.layers.74.self_attn.k_proj.weight": "model-00027-of-00030.safetensors",
662
+ "model.layers.74.self_attn.o_proj.weight": "model-00027-of-00030.safetensors",
663
+ "model.layers.74.self_attn.q_proj.weight": "model-00027-of-00030.safetensors",
664
+ "model.layers.74.self_attn.v_proj.weight": "model-00027-of-00030.safetensors",
665
+ "model.layers.75.input_layernorm.weight": "model-00028-of-00030.safetensors",
666
+ "model.layers.75.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
667
+ "model.layers.75.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
668
+ "model.layers.75.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
669
+ "model.layers.75.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
670
+ "model.layers.75.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
671
+ "model.layers.75.self_attn.o_proj.weight": "model-00028-of-00030.safetensors",
672
+ "model.layers.75.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
673
+ "model.layers.75.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
674
+ "model.layers.76.input_layernorm.weight": "model-00028-of-00030.safetensors",
675
+ "model.layers.76.mlp.down_proj.weight": "model-00028-of-00030.safetensors",
676
+ "model.layers.76.mlp.gate_proj.weight": "model-00028-of-00030.safetensors",
677
+ "model.layers.76.mlp.up_proj.weight": "model-00028-of-00030.safetensors",
678
+ "model.layers.76.post_attention_layernorm.weight": "model-00028-of-00030.safetensors",
679
+ "model.layers.76.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
680
+ "model.layers.76.self_attn.o_proj.weight": "model-00028-of-00030.safetensors",
681
+ "model.layers.76.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
682
+ "model.layers.76.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
683
+ "model.layers.77.input_layernorm.weight": "model-00029-of-00030.safetensors",
684
+ "model.layers.77.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
685
+ "model.layers.77.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
686
+ "model.layers.77.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
687
+ "model.layers.77.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
688
+ "model.layers.77.self_attn.k_proj.weight": "model-00028-of-00030.safetensors",
689
+ "model.layers.77.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
690
+ "model.layers.77.self_attn.q_proj.weight": "model-00028-of-00030.safetensors",
691
+ "model.layers.77.self_attn.v_proj.weight": "model-00028-of-00030.safetensors",
692
+ "model.layers.78.input_layernorm.weight": "model-00029-of-00030.safetensors",
693
+ "model.layers.78.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
694
+ "model.layers.78.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
695
+ "model.layers.78.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
696
+ "model.layers.78.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
697
+ "model.layers.78.self_attn.k_proj.weight": "model-00029-of-00030.safetensors",
698
+ "model.layers.78.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
699
+ "model.layers.78.self_attn.q_proj.weight": "model-00029-of-00030.safetensors",
700
+ "model.layers.78.self_attn.v_proj.weight": "model-00029-of-00030.safetensors",
701
+ "model.layers.79.input_layernorm.weight": "model-00029-of-00030.safetensors",
702
+ "model.layers.79.mlp.down_proj.weight": "model-00029-of-00030.safetensors",
703
+ "model.layers.79.mlp.gate_proj.weight": "model-00029-of-00030.safetensors",
704
+ "model.layers.79.mlp.up_proj.weight": "model-00029-of-00030.safetensors",
705
+ "model.layers.79.post_attention_layernorm.weight": "model-00029-of-00030.safetensors",
706
+ "model.layers.79.self_attn.k_proj.weight": "model-00029-of-00030.safetensors",
707
+ "model.layers.79.self_attn.o_proj.weight": "model-00029-of-00030.safetensors",
708
+ "model.layers.79.self_attn.q_proj.weight": "model-00029-of-00030.safetensors",
709
+ "model.layers.79.self_attn.v_proj.weight": "model-00029-of-00030.safetensors",
710
+ "model.layers.8.input_layernorm.weight": "model-00004-of-00030.safetensors",
711
+ "model.layers.8.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
712
+ "model.layers.8.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
713
+ "model.layers.8.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
714
+ "model.layers.8.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
715
+ "model.layers.8.self_attn.k_proj.weight": "model-00004-of-00030.safetensors",
716
+ "model.layers.8.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
717
+ "model.layers.8.self_attn.q_proj.weight": "model-00004-of-00030.safetensors",
718
+ "model.layers.8.self_attn.v_proj.weight": "model-00004-of-00030.safetensors",
719
+ "model.layers.9.input_layernorm.weight": "model-00004-of-00030.safetensors",
720
+ "model.layers.9.mlp.down_proj.weight": "model-00004-of-00030.safetensors",
721
+ "model.layers.9.mlp.gate_proj.weight": "model-00004-of-00030.safetensors",
722
+ "model.layers.9.mlp.up_proj.weight": "model-00004-of-00030.safetensors",
723
+ "model.layers.9.post_attention_layernorm.weight": "model-00004-of-00030.safetensors",
724
+ "model.layers.9.self_attn.k_proj.weight": "model-00004-of-00030.safetensors",
725
+ "model.layers.9.self_attn.o_proj.weight": "model-00004-of-00030.safetensors",
726
+ "model.layers.9.self_attn.q_proj.weight": "model-00004-of-00030.safetensors",
727
+ "model.layers.9.self_attn.v_proj.weight": "model-00004-of-00030.safetensors",
728
+ "model.norm.weight": "model-00029-of-00030.safetensors"
729
+ }
730
+ }
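The `weight_map` above is the standard sharded-checkpoint index: every tensor name points at the shard file that stores it, which is how a loader knows which of the 30 files to open for a given weight. A minimal sketch of querying it, assuming a local copy of the index file (the path below is a placeholder):

```python
import json
from collections import defaultdict

# Placeholder local path to the index file added in this commit.
with open("model.safetensors.index.json") as f:
    index = json.load(f)

weight_map = index["weight_map"]  # tensor name -> shard file name

# Entries visible in the diff above, e.g. the final norm lives in shard 29:
print(weight_map["model.norm.weight"])  # model-00029-of-00030.safetensors

# Invert the map to list every tensor stored in a given shard.
by_shard = defaultdict(list)
for name, shard in weight_map.items():
    by_shard[shard].append(name)
print(len(by_shard["model-00019-of-00030.safetensors"]))
```

Sharding keeps each file to a manageable size, which makes downloads resumable and lets each shard be integrity-checked on its own.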
output-00001-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:afd7edb6cb2d7844c205cb4d430f6d1168ef0725275de16905bcc88644e8d3e8
+ size 8500086684
output-00002-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f65cdc5584a1c9a4f15fe2230e4b66b36c524c0cb44d3779a18558e2510625bd
+ size 8501961824
output-00003-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3cb3ec11b0833a9d84cbf28195b99d98c02991c97f5d6f7fa340d418cb6b85f3
+ size 8589163668
output-00004-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:78c1daa15894a387ae1345377769d5b08c4f7e47f4744a0250540215585e0266
+ size 8516937596
output-00005-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e171ed3d66912d2a4b0b5b0f2dd25f84fa2e40a620053c0cfe8fe4fd16401d48
+ size 8486425784
output-00006-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6221c36a7e597e5bd59ba0cf042db2340034da58ea826c5161d4c147824fbb7a
+ size 3123802000
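Each `output-*.safetensors` entry above is a Git LFS pointer, not the weights themselves: it records the LFS spec version, the sha256 of the real file (`oid`), and its exact byte size. A minimal sketch for verifying a downloaded shard against its pointer, using the oid and size from `output-00006-of-00006.safetensors` above (the local path is a placeholder):

```python
import hashlib
import os

# oid/size copied from the output-00006-of-00006.safetensors pointer above;
# the local path is a placeholder.
path = "output-00006-of-00006.safetensors"
expected_oid = "6221c36a7e597e5bd59ba0cf042db2340034da58ea826c5161d4c147824fbb7a"
expected_size = 3123802000

# A size mismatch usually means a truncated download.
assert os.path.getsize(path) == expected_size, "size mismatch"

# Stream the file so the multi-GB shard never has to fit in memory.
h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
assert h.hexdigest() == expected_oid, "hash mismatch"
print("shard verified")
```

The same check works for every shard; only the `oid` and `size` values change.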
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<|begin_of_text|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|eot_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|finetune_right_pad_id|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
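`special_tokens_map.json` tells the tokenizer loader which tokens act as BOS, EOS, and padding; note that padding is mapped to the dedicated `<|finetune_right_pad_id|>` token rather than reusing EOS. A minimal sketch of the mapping taking effect at load time, assuming `transformers` is installed and this repository has been downloaded locally (the path below is a placeholder):

```python
from transformers import AutoTokenizer

# Placeholder path to a local download of this repository.
tok = AutoTokenizer.from_pretrained("path/to/Llama-3.1-70B-ArliAI-RPMax-v1.3")

print(tok.bos_token)  # <|begin_of_text|>
print(tok.eos_token)  # <|eot_id|>
print(tok.pad_token)  # <|finetune_right_pad_id|>
```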
tokenizer.json ADDED
The diff for this file is too large to render.
tokenizer_config.json ADDED
@@ -0,0 +1,2062 @@
+ {
+ "added_tokens_decoder": {
+ "128000": { "content": "<|begin_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128001": { "content": "<|end_of_text|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128002": { "content": "<|reserved_special_token_0|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128003": { "content": "<|reserved_special_token_1|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128004": { "content": "<|finetune_right_pad_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128005": { "content": "<|reserved_special_token_2|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128006": { "content": "<|start_header_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128007": { "content": "<|end_header_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128008": { "content": "<|eom_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128009": { "content": "<|eot_id|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128010": { "content": "<|python_tag|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128011": { "content": "<|reserved_special_token_3|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128012": { "content": "<|reserved_special_token_4|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128013": { "content": "<|reserved_special_token_5|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128014": { "content": "<|reserved_special_token_6|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128015": { "content": "<|reserved_special_token_7|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128016": { "content": "<|reserved_special_token_8|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128017": { "content": "<|reserved_special_token_9|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128018": { "content": "<|reserved_special_token_10|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128019": { "content": "<|reserved_special_token_11|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128020": { "content": "<|reserved_special_token_12|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128021": { "content": "<|reserved_special_token_13|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128022": { "content": "<|reserved_special_token_14|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128023": { "content": "<|reserved_special_token_15|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128024": { "content": "<|reserved_special_token_16|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128025": { "content": "<|reserved_special_token_17|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128026": { "content": "<|reserved_special_token_18|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128027": { "content": "<|reserved_special_token_19|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128028": { "content": "<|reserved_special_token_20|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128029": { "content": "<|reserved_special_token_21|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128030": { "content": "<|reserved_special_token_22|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128031": { "content": "<|reserved_special_token_23|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128032": { "content": "<|reserved_special_token_24|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128033": { "content": "<|reserved_special_token_25|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128034": { "content": "<|reserved_special_token_26|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128035": { "content": "<|reserved_special_token_27|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128036": { "content": "<|reserved_special_token_28|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128037": { "content": "<|reserved_special_token_29|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128038": { "content": "<|reserved_special_token_30|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128039": { "content": "<|reserved_special_token_31|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128040": { "content": "<|reserved_special_token_32|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128041": { "content": "<|reserved_special_token_33|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128042": { "content": "<|reserved_special_token_34|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128043": { "content": "<|reserved_special_token_35|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128044": { "content": "<|reserved_special_token_36|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128045": { "content": "<|reserved_special_token_37|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128046": { "content": "<|reserved_special_token_38|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128047": { "content": "<|reserved_special_token_39|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128048": { "content": "<|reserved_special_token_40|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128049": { "content": "<|reserved_special_token_41|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128050": { "content": "<|reserved_special_token_42|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128051": { "content": "<|reserved_special_token_43|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128052": { "content": "<|reserved_special_token_44|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128053": { "content": "<|reserved_special_token_45|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128054": { "content": "<|reserved_special_token_46|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128055": { "content": "<|reserved_special_token_47|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128056": { "content": "<|reserved_special_token_48|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128057": { "content": "<|reserved_special_token_49|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128058": { "content": "<|reserved_special_token_50|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128059": { "content": "<|reserved_special_token_51|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128060": { "content": "<|reserved_special_token_52|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128061": { "content": "<|reserved_special_token_53|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128062": { "content": "<|reserved_special_token_54|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128063": { "content": "<|reserved_special_token_55|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128064": { "content": "<|reserved_special_token_56|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128065": { "content": "<|reserved_special_token_57|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128066": { "content": "<|reserved_special_token_58|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128067": { "content": "<|reserved_special_token_59|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128068": { "content": "<|reserved_special_token_60|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128069": { "content": "<|reserved_special_token_61|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128070": { "content": "<|reserved_special_token_62|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128071": { "content": "<|reserved_special_token_63|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128072": { "content": "<|reserved_special_token_64|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128073": { "content": "<|reserved_special_token_65|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128074": { "content": "<|reserved_special_token_66|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128075": { "content": "<|reserved_special_token_67|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128076": { "content": "<|reserved_special_token_68|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128077": { "content": "<|reserved_special_token_69|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128078": { "content": "<|reserved_special_token_70|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128079": { "content": "<|reserved_special_token_71|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128080": { "content": "<|reserved_special_token_72|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128081": { "content": "<|reserved_special_token_73|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128082": { "content": "<|reserved_special_token_74|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128083": { "content": "<|reserved_special_token_75|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128084": { "content": "<|reserved_special_token_76|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128085": { "content": "<|reserved_special_token_77|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128086": { "content": "<|reserved_special_token_78|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128087": { "content": "<|reserved_special_token_79|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128088": { "content": "<|reserved_special_token_80|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128089": { "content": "<|reserved_special_token_81|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128090": { "content": "<|reserved_special_token_82|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128091": { "content": "<|reserved_special_token_83|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128092": { "content": "<|reserved_special_token_84|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128093": { "content": "<|reserved_special_token_85|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128094": { "content": "<|reserved_special_token_86|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128095": { "content": "<|reserved_special_token_87|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128096": { "content": "<|reserved_special_token_88|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128097": { "content": "<|reserved_special_token_89|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128098": { "content": "<|reserved_special_token_90|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128099": { "content": "<|reserved_special_token_91|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128100": { "content": "<|reserved_special_token_92|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128101": { "content": "<|reserved_special_token_93|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128102": { "content": "<|reserved_special_token_94|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128103": { "content": "<|reserved_special_token_95|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128104": { "content": "<|reserved_special_token_96|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128105": { "content": "<|reserved_special_token_97|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128106": { "content": "<|reserved_special_token_98|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128107": { "content": "<|reserved_special_token_99|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128108": { "content": "<|reserved_special_token_100|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128109": { "content": "<|reserved_special_token_101|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128110": { "content": "<|reserved_special_token_102|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128111": { "content": "<|reserved_special_token_103|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128112": { "content": "<|reserved_special_token_104|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128113": { "content": "<|reserved_special_token_105|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128114": { "content": "<|reserved_special_token_106|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128115": { "content": "<|reserved_special_token_107|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128116": { "content": "<|reserved_special_token_108|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128117": { "content": "<|reserved_special_token_109|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128118": { "content": "<|reserved_special_token_110|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128119": { "content": "<|reserved_special_token_111|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128120": { "content": "<|reserved_special_token_112|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128121": { "content": "<|reserved_special_token_113|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128122": { "content": "<|reserved_special_token_114|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128123": { "content": "<|reserved_special_token_115|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128124": { "content": "<|reserved_special_token_116|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128125": { "content": "<|reserved_special_token_117|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128126": { "content": "<|reserved_special_token_118|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128127": { "content": "<|reserved_special_token_119|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128128": { "content": "<|reserved_special_token_120|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128129": { "content": "<|reserved_special_token_121|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128130": { "content": "<|reserved_special_token_122|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128131": { "content": "<|reserved_special_token_123|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128132": { "content": "<|reserved_special_token_124|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128133": { "content": "<|reserved_special_token_125|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128134": { "content": "<|reserved_special_token_126|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128135": { "content": "<|reserved_special_token_127|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128136": { "content": "<|reserved_special_token_128|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128137": { "content": "<|reserved_special_token_129|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128138": { "content": "<|reserved_special_token_130|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128139": { "content": "<|reserved_special_token_131|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128140": { "content": "<|reserved_special_token_132|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128141": { "content": "<|reserved_special_token_133|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128142": { "content": "<|reserved_special_token_134|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128143": { "content": "<|reserved_special_token_135|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128144": { "content": "<|reserved_special_token_136|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128145": { "content": "<|reserved_special_token_137|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128146": { "content": "<|reserved_special_token_138|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128147": { "content": "<|reserved_special_token_139|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128148": { "content": "<|reserved_special_token_140|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128149": { "content": "<|reserved_special_token_141|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128150": { "content": "<|reserved_special_token_142|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128151": { "content": "<|reserved_special_token_143|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128152": { "content": "<|reserved_special_token_144|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
+ "128153": {
+ "content": "<|reserved_special_token_145|>",
1229
+ "lstrip": false,
1230
+ "normalized": false,
1231
+ "rstrip": false,
1232
+ "single_word": false,
1233
+ "special": true
1234
+ },
1235
+ "128154": {
1236
+ "content": "<|reserved_special_token_146|>",
1237
+ "lstrip": false,
1238
+ "normalized": false,
1239
+ "rstrip": false,
1240
+ "single_word": false,
1241
+ "special": true
1242
+ },
1243
+ "128155": {
1244
+ "content": "<|reserved_special_token_147|>",
1245
+ "lstrip": false,
1246
+ "normalized": false,
1247
+ "rstrip": false,
1248
+ "single_word": false,
1249
+ "special": true
1250
+ },
1251
+ "128156": {
1252
+ "content": "<|reserved_special_token_148|>",
1253
+ "lstrip": false,
1254
+ "normalized": false,
1255
+ "rstrip": false,
1256
+ "single_word": false,
1257
+ "special": true
1258
+ },
1259
+ "128157": {
1260
+ "content": "<|reserved_special_token_149|>",
1261
+ "lstrip": false,
1262
+ "normalized": false,
1263
+ "rstrip": false,
1264
+ "single_word": false,
1265
+ "special": true
1266
+ },
1267
+ "128158": {
1268
+ "content": "<|reserved_special_token_150|>",
1269
+ "lstrip": false,
1270
+ "normalized": false,
1271
+ "rstrip": false,
1272
+ "single_word": false,
1273
+ "special": true
1274
+ },
1275
+ "128159": {
1276
+ "content": "<|reserved_special_token_151|>",
1277
+ "lstrip": false,
1278
+ "normalized": false,
1279
+ "rstrip": false,
1280
+ "single_word": false,
1281
+ "special": true
1282
+ },
1283
+ "128160": {
1284
+ "content": "<|reserved_special_token_152|>",
1285
+ "lstrip": false,
1286
+ "normalized": false,
1287
+ "rstrip": false,
1288
+ "single_word": false,
1289
+ "special": true
1290
+ },
1291
+ "128161": {
1292
+ "content": "<|reserved_special_token_153|>",
1293
+ "lstrip": false,
1294
+ "normalized": false,
1295
+ "rstrip": false,
1296
+ "single_word": false,
1297
+ "special": true
1298
+ },
1299
+ "128162": {
1300
+ "content": "<|reserved_special_token_154|>",
1301
+ "lstrip": false,
1302
+ "normalized": false,
1303
+ "rstrip": false,
1304
+ "single_word": false,
1305
+ "special": true
1306
+ },
1307
+ "128163": {
1308
+ "content": "<|reserved_special_token_155|>",
1309
+ "lstrip": false,
1310
+ "normalized": false,
1311
+ "rstrip": false,
1312
+ "single_word": false,
1313
+ "special": true
1314
+ },
1315
+ "128164": {
1316
+ "content": "<|reserved_special_token_156|>",
1317
+ "lstrip": false,
1318
+ "normalized": false,
1319
+ "rstrip": false,
1320
+ "single_word": false,
1321
+ "special": true
1322
+ },
1323
+ "128165": {
1324
+ "content": "<|reserved_special_token_157|>",
1325
+ "lstrip": false,
1326
+ "normalized": false,
1327
+ "rstrip": false,
1328
+ "single_word": false,
1329
+ "special": true
1330
+ },
1331
+ "128166": {
1332
+ "content": "<|reserved_special_token_158|>",
1333
+ "lstrip": false,
1334
+ "normalized": false,
1335
+ "rstrip": false,
1336
+ "single_word": false,
1337
+ "special": true
1338
+ },
1339
+ "128167": {
1340
+ "content": "<|reserved_special_token_159|>",
1341
+ "lstrip": false,
1342
+ "normalized": false,
1343
+ "rstrip": false,
1344
+ "single_word": false,
1345
+ "special": true
1346
+ },
1347
+ "128168": {
1348
+ "content": "<|reserved_special_token_160|>",
1349
+ "lstrip": false,
1350
+ "normalized": false,
1351
+ "rstrip": false,
1352
+ "single_word": false,
1353
+ "special": true
1354
+ },
1355
+ "128169": {
1356
+ "content": "<|reserved_special_token_161|>",
1357
+ "lstrip": false,
1358
+ "normalized": false,
1359
+ "rstrip": false,
1360
+ "single_word": false,
1361
+ "special": true
1362
+ },
1363
+ "128170": {
1364
+ "content": "<|reserved_special_token_162|>",
1365
+ "lstrip": false,
1366
+ "normalized": false,
1367
+ "rstrip": false,
1368
+ "single_word": false,
1369
+ "special": true
1370
+ },
1371
+ "128171": {
1372
+ "content": "<|reserved_special_token_163|>",
1373
+ "lstrip": false,
1374
+ "normalized": false,
1375
+ "rstrip": false,
1376
+ "single_word": false,
1377
+ "special": true
1378
+ },
1379
+ "128172": {
1380
+ "content": "<|reserved_special_token_164|>",
1381
+ "lstrip": false,
1382
+ "normalized": false,
1383
+ "rstrip": false,
1384
+ "single_word": false,
1385
+ "special": true
1386
+ },
1387
+ "128173": {
1388
+ "content": "<|reserved_special_token_165|>",
1389
+ "lstrip": false,
1390
+ "normalized": false,
1391
+ "rstrip": false,
1392
+ "single_word": false,
1393
+ "special": true
1394
+ },
1395
+ "128174": {
1396
+ "content": "<|reserved_special_token_166|>",
1397
+ "lstrip": false,
1398
+ "normalized": false,
1399
+ "rstrip": false,
1400
+ "single_word": false,
1401
+ "special": true
1402
+ },
1403
+ "128175": {
1404
+ "content": "<|reserved_special_token_167|>",
1405
+ "lstrip": false,
1406
+ "normalized": false,
1407
+ "rstrip": false,
1408
+ "single_word": false,
1409
+ "special": true
1410
+ },
1411
+ "128176": {
1412
+ "content": "<|reserved_special_token_168|>",
1413
+ "lstrip": false,
1414
+ "normalized": false,
1415
+ "rstrip": false,
1416
+ "single_word": false,
1417
+ "special": true
1418
+ },
1419
+ "128177": {
1420
+ "content": "<|reserved_special_token_169|>",
1421
+ "lstrip": false,
1422
+ "normalized": false,
1423
+ "rstrip": false,
1424
+ "single_word": false,
1425
+ "special": true
1426
+ },
1427
+ "128178": {
1428
+ "content": "<|reserved_special_token_170|>",
1429
+ "lstrip": false,
1430
+ "normalized": false,
1431
+ "rstrip": false,
1432
+ "single_word": false,
1433
+ "special": true
1434
+ },
1435
+ "128179": {
1436
+ "content": "<|reserved_special_token_171|>",
1437
+ "lstrip": false,
1438
+ "normalized": false,
1439
+ "rstrip": false,
1440
+ "single_word": false,
1441
+ "special": true
1442
+ },
1443
+ "128180": {
1444
+ "content": "<|reserved_special_token_172|>",
1445
+ "lstrip": false,
1446
+ "normalized": false,
1447
+ "rstrip": false,
1448
+ "single_word": false,
1449
+ "special": true
1450
+ },
1451
+ "128181": {
1452
+ "content": "<|reserved_special_token_173|>",
1453
+ "lstrip": false,
1454
+ "normalized": false,
1455
+ "rstrip": false,
1456
+ "single_word": false,
1457
+ "special": true
1458
+ },
1459
+ "128182": {
1460
+ "content": "<|reserved_special_token_174|>",
1461
+ "lstrip": false,
1462
+ "normalized": false,
1463
+ "rstrip": false,
1464
+ "single_word": false,
1465
+ "special": true
1466
+ },
1467
+ "128183": {
1468
+ "content": "<|reserved_special_token_175|>",
1469
+ "lstrip": false,
1470
+ "normalized": false,
1471
+ "rstrip": false,
1472
+ "single_word": false,
1473
+ "special": true
1474
+ },
1475
+ "128184": {
1476
+ "content": "<|reserved_special_token_176|>",
1477
+ "lstrip": false,
1478
+ "normalized": false,
1479
+ "rstrip": false,
1480
+ "single_word": false,
1481
+ "special": true
1482
+ },
1483
+ "128185": {
1484
+ "content": "<|reserved_special_token_177|>",
1485
+ "lstrip": false,
1486
+ "normalized": false,
1487
+ "rstrip": false,
1488
+ "single_word": false,
1489
+ "special": true
1490
+ },
1491
+ "128186": {
1492
+ "content": "<|reserved_special_token_178|>",
1493
+ "lstrip": false,
1494
+ "normalized": false,
1495
+ "rstrip": false,
1496
+ "single_word": false,
1497
+ "special": true
1498
+ },
1499
+ "128187": {
1500
+ "content": "<|reserved_special_token_179|>",
1501
+ "lstrip": false,
1502
+ "normalized": false,
1503
+ "rstrip": false,
1504
+ "single_word": false,
1505
+ "special": true
1506
+ },
1507
+ "128188": {
1508
+ "content": "<|reserved_special_token_180|>",
1509
+ "lstrip": false,
1510
+ "normalized": false,
1511
+ "rstrip": false,
1512
+ "single_word": false,
1513
+ "special": true
1514
+ },
1515
+ "128189": {
1516
+ "content": "<|reserved_special_token_181|>",
1517
+ "lstrip": false,
1518
+ "normalized": false,
1519
+ "rstrip": false,
1520
+ "single_word": false,
1521
+ "special": true
1522
+ },
1523
+ "128190": {
1524
+ "content": "<|reserved_special_token_182|>",
1525
+ "lstrip": false,
1526
+ "normalized": false,
1527
+ "rstrip": false,
1528
+ "single_word": false,
1529
+ "special": true
1530
+ },
1531
+ "128191": {
1532
+ "content": "<|reserved_special_token_183|>",
1533
+ "lstrip": false,
1534
+ "normalized": false,
1535
+ "rstrip": false,
1536
+ "single_word": false,
1537
+ "special": true
1538
+ },
1539
+ "128192": {
1540
+ "content": "<|reserved_special_token_184|>",
1541
+ "lstrip": false,
1542
+ "normalized": false,
1543
+ "rstrip": false,
1544
+ "single_word": false,
1545
+ "special": true
1546
+ },
1547
+ "128193": {
1548
+ "content": "<|reserved_special_token_185|>",
1549
+ "lstrip": false,
1550
+ "normalized": false,
1551
+ "rstrip": false,
1552
+ "single_word": false,
1553
+ "special": true
1554
+ },
1555
+ "128194": {
1556
+ "content": "<|reserved_special_token_186|>",
1557
+ "lstrip": false,
1558
+ "normalized": false,
1559
+ "rstrip": false,
1560
+ "single_word": false,
1561
+ "special": true
1562
+ },
1563
+ "128195": {
1564
+ "content": "<|reserved_special_token_187|>",
1565
+ "lstrip": false,
1566
+ "normalized": false,
1567
+ "rstrip": false,
1568
+ "single_word": false,
1569
+ "special": true
1570
+ },
1571
+ "128196": {
1572
+ "content": "<|reserved_special_token_188|>",
1573
+ "lstrip": false,
1574
+ "normalized": false,
1575
+ "rstrip": false,
1576
+ "single_word": false,
1577
+ "special": true
1578
+ },
1579
+ "128197": {
1580
+ "content": "<|reserved_special_token_189|>",
1581
+ "lstrip": false,
1582
+ "normalized": false,
1583
+ "rstrip": false,
1584
+ "single_word": false,
1585
+ "special": true
1586
+ },
1587
+ "128198": {
1588
+ "content": "<|reserved_special_token_190|>",
1589
+ "lstrip": false,
1590
+ "normalized": false,
1591
+ "rstrip": false,
1592
+ "single_word": false,
1593
+ "special": true
1594
+ },
1595
+ "128199": {
1596
+ "content": "<|reserved_special_token_191|>",
1597
+ "lstrip": false,
1598
+ "normalized": false,
1599
+ "rstrip": false,
1600
+ "single_word": false,
1601
+ "special": true
1602
+ },
1603
+ "128200": {
1604
+ "content": "<|reserved_special_token_192|>",
1605
+ "lstrip": false,
1606
+ "normalized": false,
1607
+ "rstrip": false,
1608
+ "single_word": false,
1609
+ "special": true
1610
+ },
1611
+ "128201": {
1612
+ "content": "<|reserved_special_token_193|>",
1613
+ "lstrip": false,
1614
+ "normalized": false,
1615
+ "rstrip": false,
1616
+ "single_word": false,
1617
+ "special": true
1618
+ },
1619
+ "128202": {
1620
+ "content": "<|reserved_special_token_194|>",
1621
+ "lstrip": false,
1622
+ "normalized": false,
1623
+ "rstrip": false,
1624
+ "single_word": false,
1625
+ "special": true
1626
+ },
1627
+ "128203": {
1628
+ "content": "<|reserved_special_token_195|>",
1629
+ "lstrip": false,
1630
+ "normalized": false,
1631
+ "rstrip": false,
1632
+ "single_word": false,
1633
+ "special": true
1634
+ },
1635
+ "128204": {
1636
+ "content": "<|reserved_special_token_196|>",
1637
+ "lstrip": false,
1638
+ "normalized": false,
1639
+ "rstrip": false,
1640
+ "single_word": false,
1641
+ "special": true
1642
+ },
1643
+ "128205": {
1644
+ "content": "<|reserved_special_token_197|>",
1645
+ "lstrip": false,
1646
+ "normalized": false,
1647
+ "rstrip": false,
1648
+ "single_word": false,
1649
+ "special": true
1650
+ },
1651
+ "128206": {
1652
+ "content": "<|reserved_special_token_198|>",
1653
+ "lstrip": false,
1654
+ "normalized": false,
1655
+ "rstrip": false,
1656
+ "single_word": false,
1657
+ "special": true
1658
+ },
1659
+ "128207": {
1660
+ "content": "<|reserved_special_token_199|>",
1661
+ "lstrip": false,
1662
+ "normalized": false,
1663
+ "rstrip": false,
1664
+ "single_word": false,
1665
+ "special": true
1666
+ },
1667
+ "128208": {
1668
+ "content": "<|reserved_special_token_200|>",
1669
+ "lstrip": false,
1670
+ "normalized": false,
1671
+ "rstrip": false,
1672
+ "single_word": false,
1673
+ "special": true
1674
+ },
1675
+ "128209": {
1676
+ "content": "<|reserved_special_token_201|>",
1677
+ "lstrip": false,
1678
+ "normalized": false,
1679
+ "rstrip": false,
1680
+ "single_word": false,
1681
+ "special": true
1682
+ },
1683
+ "128210": {
1684
+ "content": "<|reserved_special_token_202|>",
1685
+ "lstrip": false,
1686
+ "normalized": false,
1687
+ "rstrip": false,
1688
+ "single_word": false,
1689
+ "special": true
1690
+ },
1691
+ "128211": {
1692
+ "content": "<|reserved_special_token_203|>",
1693
+ "lstrip": false,
1694
+ "normalized": false,
1695
+ "rstrip": false,
1696
+ "single_word": false,
1697
+ "special": true
1698
+ },
1699
+ "128212": {
1700
+ "content": "<|reserved_special_token_204|>",
1701
+ "lstrip": false,
1702
+ "normalized": false,
1703
+ "rstrip": false,
1704
+ "single_word": false,
1705
+ "special": true
1706
+ },
1707
+ "128213": {
1708
+ "content": "<|reserved_special_token_205|>",
1709
+ "lstrip": false,
1710
+ "normalized": false,
1711
+ "rstrip": false,
1712
+ "single_word": false,
1713
+ "special": true
1714
+ },
1715
+ "128214": {
1716
+ "content": "<|reserved_special_token_206|>",
1717
+ "lstrip": false,
1718
+ "normalized": false,
1719
+ "rstrip": false,
1720
+ "single_word": false,
1721
+ "special": true
1722
+ },
1723
+ "128215": {
1724
+ "content": "<|reserved_special_token_207|>",
1725
+ "lstrip": false,
1726
+ "normalized": false,
1727
+ "rstrip": false,
1728
+ "single_word": false,
1729
+ "special": true
1730
+ },
1731
+ "128216": {
1732
+ "content": "<|reserved_special_token_208|>",
1733
+ "lstrip": false,
1734
+ "normalized": false,
1735
+ "rstrip": false,
1736
+ "single_word": false,
1737
+ "special": true
1738
+ },
1739
+ "128217": {
1740
+ "content": "<|reserved_special_token_209|>",
1741
+ "lstrip": false,
1742
+ "normalized": false,
1743
+ "rstrip": false,
1744
+ "single_word": false,
1745
+ "special": true
1746
+ },
1747
+ "128218": {
1748
+ "content": "<|reserved_special_token_210|>",
1749
+ "lstrip": false,
1750
+ "normalized": false,
1751
+ "rstrip": false,
1752
+ "single_word": false,
1753
+ "special": true
1754
+ },
1755
+ "128219": {
1756
+ "content": "<|reserved_special_token_211|>",
1757
+ "lstrip": false,
1758
+ "normalized": false,
1759
+ "rstrip": false,
1760
+ "single_word": false,
1761
+ "special": true
1762
+ },
1763
+ "128220": {
1764
+ "content": "<|reserved_special_token_212|>",
1765
+ "lstrip": false,
1766
+ "normalized": false,
1767
+ "rstrip": false,
1768
+ "single_word": false,
1769
+ "special": true
1770
+ },
1771
+ "128221": {
1772
+ "content": "<|reserved_special_token_213|>",
1773
+ "lstrip": false,
1774
+ "normalized": false,
1775
+ "rstrip": false,
1776
+ "single_word": false,
1777
+ "special": true
1778
+ },
1779
+ "128222": {
1780
+ "content": "<|reserved_special_token_214|>",
1781
+ "lstrip": false,
1782
+ "normalized": false,
1783
+ "rstrip": false,
1784
+ "single_word": false,
1785
+ "special": true
1786
+ },
1787
+ "128223": {
1788
+ "content": "<|reserved_special_token_215|>",
1789
+ "lstrip": false,
1790
+ "normalized": false,
1791
+ "rstrip": false,
1792
+ "single_word": false,
1793
+ "special": true
1794
+ },
1795
+ "128224": {
1796
+ "content": "<|reserved_special_token_216|>",
1797
+ "lstrip": false,
1798
+ "normalized": false,
1799
+ "rstrip": false,
1800
+ "single_word": false,
1801
+ "special": true
1802
+ },
1803
+ "128225": {
1804
+ "content": "<|reserved_special_token_217|>",
1805
+ "lstrip": false,
1806
+ "normalized": false,
1807
+ "rstrip": false,
1808
+ "single_word": false,
1809
+ "special": true
1810
+ },
1811
+ "128226": {
1812
+ "content": "<|reserved_special_token_218|>",
1813
+ "lstrip": false,
1814
+ "normalized": false,
1815
+ "rstrip": false,
1816
+ "single_word": false,
1817
+ "special": true
1818
+ },
1819
+ "128227": {
1820
+ "content": "<|reserved_special_token_219|>",
1821
+ "lstrip": false,
1822
+ "normalized": false,
1823
+ "rstrip": false,
1824
+ "single_word": false,
1825
+ "special": true
1826
+ },
1827
+ "128228": {
1828
+ "content": "<|reserved_special_token_220|>",
1829
+ "lstrip": false,
1830
+ "normalized": false,
1831
+ "rstrip": false,
1832
+ "single_word": false,
1833
+ "special": true
1834
+ },
1835
+ "128229": {
1836
+ "content": "<|reserved_special_token_221|>",
1837
+ "lstrip": false,
1838
+ "normalized": false,
1839
+ "rstrip": false,
1840
+ "single_word": false,
1841
+ "special": true
1842
+ },
1843
+ "128230": {
1844
+ "content": "<|reserved_special_token_222|>",
1845
+ "lstrip": false,
1846
+ "normalized": false,
1847
+ "rstrip": false,
1848
+ "single_word": false,
1849
+ "special": true
1850
+ },
1851
+ "128231": {
1852
+ "content": "<|reserved_special_token_223|>",
1853
+ "lstrip": false,
1854
+ "normalized": false,
1855
+ "rstrip": false,
1856
+ "single_word": false,
1857
+ "special": true
1858
+ },
1859
+ "128232": {
1860
+ "content": "<|reserved_special_token_224|>",
1861
+ "lstrip": false,
1862
+ "normalized": false,
1863
+ "rstrip": false,
1864
+ "single_word": false,
1865
+ "special": true
1866
+ },
1867
+ "128233": {
1868
+ "content": "<|reserved_special_token_225|>",
1869
+ "lstrip": false,
1870
+ "normalized": false,
1871
+ "rstrip": false,
1872
+ "single_word": false,
1873
+ "special": true
1874
+ },
1875
+ "128234": {
1876
+ "content": "<|reserved_special_token_226|>",
1877
+ "lstrip": false,
1878
+ "normalized": false,
1879
+ "rstrip": false,
1880
+ "single_word": false,
1881
+ "special": true
1882
+ },
1883
+ "128235": {
1884
+ "content": "<|reserved_special_token_227|>",
1885
+ "lstrip": false,
1886
+ "normalized": false,
1887
+ "rstrip": false,
1888
+ "single_word": false,
1889
+ "special": true
1890
+ },
1891
+ "128236": {
1892
+ "content": "<|reserved_special_token_228|>",
1893
+ "lstrip": false,
1894
+ "normalized": false,
1895
+ "rstrip": false,
1896
+ "single_word": false,
1897
+ "special": true
1898
+ },
1899
+ "128237": {
1900
+ "content": "<|reserved_special_token_229|>",
1901
+ "lstrip": false,
1902
+ "normalized": false,
1903
+ "rstrip": false,
1904
+ "single_word": false,
1905
+ "special": true
1906
+ },
1907
+ "128238": {
1908
+ "content": "<|reserved_special_token_230|>",
1909
+ "lstrip": false,
1910
+ "normalized": false,
1911
+ "rstrip": false,
1912
+ "single_word": false,
1913
+ "special": true
1914
+ },
1915
+ "128239": {
1916
+ "content": "<|reserved_special_token_231|>",
1917
+ "lstrip": false,
1918
+ "normalized": false,
1919
+ "rstrip": false,
1920
+ "single_word": false,
1921
+ "special": true
1922
+ },
1923
+ "128240": {
1924
+ "content": "<|reserved_special_token_232|>",
1925
+ "lstrip": false,
1926
+ "normalized": false,
1927
+ "rstrip": false,
1928
+ "single_word": false,
1929
+ "special": true
1930
+ },
1931
+ "128241": {
1932
+ "content": "<|reserved_special_token_233|>",
1933
+ "lstrip": false,
1934
+ "normalized": false,
1935
+ "rstrip": false,
1936
+ "single_word": false,
1937
+ "special": true
1938
+ },
1939
+ "128242": {
1940
+ "content": "<|reserved_special_token_234|>",
1941
+ "lstrip": false,
1942
+ "normalized": false,
1943
+ "rstrip": false,
1944
+ "single_word": false,
1945
+ "special": true
1946
+ },
1947
+ "128243": {
1948
+ "content": "<|reserved_special_token_235|>",
1949
+ "lstrip": false,
1950
+ "normalized": false,
1951
+ "rstrip": false,
1952
+ "single_word": false,
1953
+ "special": true
1954
+ },
1955
+ "128244": {
1956
+ "content": "<|reserved_special_token_236|>",
1957
+ "lstrip": false,
1958
+ "normalized": false,
1959
+ "rstrip": false,
1960
+ "single_word": false,
1961
+ "special": true
1962
+ },
1963
+ "128245": {
1964
+ "content": "<|reserved_special_token_237|>",
1965
+ "lstrip": false,
1966
+ "normalized": false,
1967
+ "rstrip": false,
1968
+ "single_word": false,
1969
+ "special": true
1970
+ },
1971
+ "128246": {
1972
+ "content": "<|reserved_special_token_238|>",
1973
+ "lstrip": false,
1974
+ "normalized": false,
1975
+ "rstrip": false,
1976
+ "single_word": false,
1977
+ "special": true
1978
+ },
1979
+ "128247": {
1980
+ "content": "<|reserved_special_token_239|>",
1981
+ "lstrip": false,
1982
+ "normalized": false,
1983
+ "rstrip": false,
1984
+ "single_word": false,
1985
+ "special": true
1986
+ },
1987
+ "128248": {
1988
+ "content": "<|reserved_special_token_240|>",
1989
+ "lstrip": false,
1990
+ "normalized": false,
1991
+ "rstrip": false,
1992
+ "single_word": false,
1993
+ "special": true
1994
+ },
1995
+ "128249": {
1996
+ "content": "<|reserved_special_token_241|>",
1997
+ "lstrip": false,
1998
+ "normalized": false,
1999
+ "rstrip": false,
2000
+ "single_word": false,
2001
+ "special": true
2002
+ },
2003
+ "128250": {
2004
+ "content": "<|reserved_special_token_242|>",
2005
+ "lstrip": false,
2006
+ "normalized": false,
2007
+ "rstrip": false,
2008
+ "single_word": false,
2009
+ "special": true
2010
+ },
2011
+ "128251": {
2012
+ "content": "<|reserved_special_token_243|>",
2013
+ "lstrip": false,
2014
+ "normalized": false,
2015
+ "rstrip": false,
2016
+ "single_word": false,
2017
+ "special": true
2018
+ },
2019
+ "128252": {
2020
+ "content": "<|reserved_special_token_244|>",
2021
+ "lstrip": false,
2022
+ "normalized": false,
2023
+ "rstrip": false,
2024
+ "single_word": false,
2025
+ "special": true
2026
+ },
2027
+ "128253": {
2028
+ "content": "<|reserved_special_token_245|>",
2029
+ "lstrip": false,
2030
+ "normalized": false,
2031
+ "rstrip": false,
2032
+ "single_word": false,
2033
+ "special": true
2034
+ },
2035
+ "128254": {
2036
+ "content": "<|reserved_special_token_246|>",
2037
+ "lstrip": false,
2038
+ "normalized": false,
2039
+ "rstrip": false,
2040
+ "single_word": false,
2041
+ "special": true
2042
+ },
2043
+ "128255": {
2044
+ "content": "<|reserved_special_token_247|>",
2045
+ "lstrip": false,
2046
+ "normalized": false,
2047
+ "rstrip": false,
2048
+ "single_word": false,
2049
+ "special": true
2050
+ }
2051
+ },
2052
+ "bos_token": "<|begin_of_text|>",
2053
+ "chat_template": "{% set loop_messages = messages %}{% for message in loop_messages %}{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n'+ message['content'] | trim + '<|eot_id|>' %}{% if loop.index0 == 0 %}{% set content = bos_token + content %}{% endif %}{{ content }}{% endfor %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}",
2054
+ "clean_up_tokenization_spaces": true,
2055
+ "eos_token": "<|eot_id|>",
2056
+ "model_input_names": [
2057
+ "input_ids",
2058
+ "attention_mask"
2059
+ ],
2060
+ "model_max_length": 131072,
2061
+ "tokenizer_class": "PreTrainedTokenizerFast"
2062
+ }
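
For reference, the `chat_template` added above is the standard Llama 3.1 instruct format: each message is wrapped as `<|start_header_id|>role<|end_header_id|>\n\n … <|eot_id|>`, the first message is prefixed with the `<|begin_of_text|>` BOS token, and the rendered string ends with an assistant header (this template appends it unconditionally, with no `add_generation_prompt` check). A minimal sketch of how it renders with `transformers`, assuming the tokenizer files from this commit are loaded; the repo id below is illustrative:

```python
# Minimal sketch: render a conversation with the chat_template from this
# tokenizer_config.json. The repo id is illustrative -- point it at wherever
# these tokenizer files actually live.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ArliAI/Llama-3.1-70B-ArliAI-RPMax-v1.3")

messages = [
    {"role": "system", "content": "You are a narrator for a fantasy roleplay."},
    {"role": "user", "content": "Describe the tavern we just entered."},
]

# tokenize=False returns the formatted prompt string instead of token ids.
# Because this template always appends the trailing assistant header, the
# output ends ready for the model's next turn.
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)
# <|begin_of_text|><|start_header_id|>system<|end_header_id|>
#
# You are a narrator for a fantasy roleplay.<|eot_id|><|start_header_id|>user<|end_header_id|>
#
# Describe the tavern we just entered.<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```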