tensorops committed on
Commit
956bcd8
1 Parent(s): 04846e0

update model to latest

.ipynb_checkpoints/README-checkpoint.md ADDED
@@ -0,0 +1,111 @@
1
+ ---
2
+ language:
3
+ - th
4
+ license: apache-2.0
5
+ library_name: transformers
6
+ tags:
7
+ - whisper-event
8
+ - generated_from_trainer
9
+ datasets:
10
+ - mozilla-foundation/common_voice_13_0
11
+ - google/fleurs
12
+ metrics:
13
+ - wer
14
+ base_model: openai/whisper-small
15
+ model-index:
16
+ - name: Whisper Small Thai Combined V4 - biodatlab
17
+ results:
18
+ - task:
19
+ type: automatic-speech-recognition
20
+ name: Automatic Speech Recognition
21
+ dataset:
22
+ name: mozilla-foundation/common_voice_13_0 th
23
+ type: mozilla-foundation/common_voice_13_0
24
+ config: th
25
+ split: test
26
+ args: th
27
+ metrics:
28
+ - type: wer
29
+ value: 13.14
30
+ name: Wer
31
+ ---
32
+
33
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
34
+ should probably proofread and complete it, then remove this comment. -->
35
+
36
+ # Whisper Small (Thai): Combined V4
37
+
38
+ This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small), trained on augmented versions of the mozilla-foundation/common_voice_13_0 th and google/fleurs datasets, plus additional curated data.
39
+ It achieves the following results on the common-voice-13 test set:
40
+ - WER: 13.14 (with Deepcut Tokenizer)
41
+
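Thai is written without spaces between words, so the WER above depends on the word tokenizer used to segment reference and hypothesis; the 13.14 figure uses the Deepcut tokenizer. As a minimal sketch (not the exact evaluation script), WER over pre-tokenized word lists — e.g. the output of `deepcut.tokenize` on both strings — can be computed as:

```python
def wer(reference_words, hypothesis_words):
    """Word error rate: Levenshtein distance over word lists / reference length."""
    r, h = reference_words, hypothesis_words
    # dp[i][j] = edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = dp[i - 1][j - 1] + (r[i - 1] != h[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(r)][len(h)] / len(r)
```

Because segmentation changes the word boundaries, the same transcription pair can yield different WER under different Thai tokenizers.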
42
+ ## Model description
43
+
44
+ Use the model with Hugging Face's `transformers` as follows:
45
+
46
+ ```py
47
+ import torch
+ from transformers import pipeline
48
+
49
+ MODEL_NAME = "biodatlab/whisper-th-small-combined" # specify the model name
50
+ lang = "th"  # Thai language code
51
+
52
+ device = 0 if torch.cuda.is_available() else "cpu"
53
+
54
+ pipe = pipeline(
55
+ task="automatic-speech-recognition",
56
+ model=MODEL_NAME,
57
+ chunk_length_s=30,
58
+ device=device,
59
+ )
60
+ pipe.model.config.forced_decoder_ids = pipe.tokenizer.get_decoder_prompt_ids(
61
+ language=lang,
62
+ task="transcribe"
63
+ )
64
+ text = pipe("audio.mp3")["text"]  # transcribe an audio file to text
65
+ ```
66
+
67
+
68
+ ## Intended uses & limitations
69
+
70
+ More information needed
71
+
72
+ ## Training and evaluation data
73
+
74
+ More information needed
75
+
76
+ ## Training procedure
77
+
78
+ ### Training hyperparameters
79
+
80
+ The following hyperparameters were used during training:
81
+ - learning_rate: 1e-05
82
+ - train_batch_size: 16
83
+ - eval_batch_size: 16
84
+ - seed: 42
85
+ - optimizer: AdamW with betas=(0.9,0.999) and epsilon=1e-08
86
+ - lr_scheduler_type: linear
87
+ - lr_scheduler_warmup_steps: 500
88
+ - training_steps: 10000
89
+ - mixed_precision_training: Native AMP
90
+
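The linear scheduler with 500 warmup steps over 10000 training steps ramps the learning rate up to 1e-05 and then decays it linearly to zero; Hugging Face's `get_linear_schedule_with_warmup` implements this shape. A self-contained sketch (function name is illustrative):

```python
def linear_warmup_lr(step, base_lr=1e-5, warmup_steps=500, total_steps=10000):
    """LR at a given step: linear warmup to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```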
91
+ ### Framework versions
92
+
93
+ - Transformers 4.37.2
94
+ - Pytorch 2.1.0
95
+ - Datasets 2.16.1
96
+ - Tokenizers 0.15.1
97
+
98
+ ## Citation
99
+
100
+ Cite using BibTeX:
101
+
102
+ ```
103
+ @misc{thonburian_whisper_med,
104
+ author = { Atirut Boribalburephan, Zaw Htet Aung, Knot Pipatsrisawat, Titipat Achakulvisut },
105
+ title = { Thonburian Whisper: A fine-tuned Whisper model for Thai automatic speech recognition },
106
+ year = 2022,
107
+ url = { https://huggingface.co/biodatlab/whisper-th-medium-combined },
108
+ doi = { 10.57967/hf/0226 },
109
+ publisher = { Hugging Face }
110
+ }
111
+ ```
README.md CHANGED
@@ -2,44 +2,68 @@
2
  language:
3
  - th
4
  license: apache-2.0
 
5
  tags:
6
  - whisper-event
7
  - generated_from_trainer
8
  datasets:
9
- - mozilla-foundation/common_voice_11_0
 
10
  metrics:
11
  - wer
 
12
  model-index:
13
- - name: Whisper Small Thai Combined Concat
14
  results:
15
  - task:
16
- name: Automatic Speech Recognition
17
  type: automatic-speech-recognition
 
18
  dataset:
19
- name: mozilla-foundation/common_voice_11_0 th
20
- type: mozilla-foundation/common_voice_11_0
21
  config: th
22
  split: test
23
  args: th
24
  metrics:
25
- - name: Wer
26
- type: wer
27
- value: 27.279438445464898
28
  ---
29
 
30
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
31
  should probably proofread and complete it, then remove this comment. -->
32
 
33
- # Whisper Small Thai Combined Concat
34
 
35
- This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the mozilla-foundation/common_voice_11_0 th dataset and additional scraped data.
36
- It achieves the following results on the evaluation set:
37
- - Loss: 0.5034
38
- - Wer: 27.2794 (without tokenizer)
39
 
40
  ## Model description
41
 
42
- More information needed
43
 
44
  ## Intended uses & limitations
45
 
@@ -55,25 +79,33 @@ More information needed
55
 
56
  The following hyperparameters were used during training:
57
  - learning_rate: 1e-05
58
- - train_batch_size: 64
59
- - eval_batch_size: 32
60
  - seed: 42
61
- - distributed_type: multi-GPU
62
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
63
  - lr_scheduler_type: linear
64
  - lr_scheduler_warmup_steps: 500
65
- - training_steps: 5000
66
-
67
- ### Training results
68
-
69
- | Training Loss | Epoch | Step | Validation Loss | Wer |
70
- |:-------------:|:-----:|:----:|:---------------:|:-------:|
71
- | 0.0002 | 83.33 | 5000 | 0.5034 | 27.2794 |
72
-
73
 
74
  ### Framework versions
75
 
76
- - Transformers 4.27.0.dev0
77
- - Pytorch 1.13.1
78
- - Datasets 2.9.1.dev0
79
- - Tokenizers 0.13.2
2
  language:
3
  - th
4
  license: apache-2.0
5
+ library_name: transformers
6
  tags:
7
  - whisper-event
8
  - generated_from_trainer
9
  datasets:
10
+ - mozilla-foundation/common_voice_13_0
11
+ - google/fleurs
12
  metrics:
13
  - wer
14
+ base_model: openai/whisper-small
15
  model-index:
16
+ - name: Whisper Small Thai Combined V4 - biodatlab
17
  results:
18
  - task:
 
19
  type: automatic-speech-recognition
20
+ name: Automatic Speech Recognition
21
  dataset:
22
+ name: mozilla-foundation/common_voice_13_0 th
23
+ type: mozilla-foundation/common_voice_13_0
24
  config: th
25
  split: test
26
  args: th
27
  metrics:
28
+ - type: wer
29
+ value: 13.14
30
+ name: Wer
31
  ---
32
 
33
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
34
  should probably proofread and complete it, then remove this comment. -->
35
 
36
+ # Whisper Small (Thai): Combined V4
37
 
38
+ This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small), trained on augmented versions of the mozilla-foundation/common_voice_13_0 th and google/fleurs datasets, plus additional curated data.
39
+ It achieves the following results on the common-voice-13 test set:
40
+ - WER: 13.14 (with Deepcut Tokenizer)
 
41
 
42
  ## Model description
43
 
44
+ Use the model with Hugging Face's `transformers` as follows:
45
+
46
+ ```py
47
+ import torch
+ from transformers import pipeline
48
+
49
+ MODEL_NAME = "biodatlab/whisper-th-small-combined" # specify the model name
50
+ lang = "th"  # Thai language code
51
+
52
+ device = 0 if torch.cuda.is_available() else "cpu"
53
+
54
+ pipe = pipeline(
55
+ task="automatic-speech-recognition",
56
+ model=MODEL_NAME,
57
+ chunk_length_s=30,
58
+ device=device,
59
+ )
60
+ pipe.model.config.forced_decoder_ids = pipe.tokenizer.get_decoder_prompt_ids(
61
+ language=lang,
62
+ task="transcribe"
63
+ )
64
+ text = pipe("audio.mp3")["text"]  # transcribe an audio file to text
65
+ ```
66
+
67
 
68
  ## Intended uses & limitations
69
 
 
79
 
80
  The following hyperparameters were used during training:
81
  - learning_rate: 1e-05
82
+ - train_batch_size: 16
83
+ - eval_batch_size: 16
84
  - seed: 42
85
+ - optimizer: AdamW with betas=(0.9,0.999) and epsilon=1e-08
 
86
  - lr_scheduler_type: linear
87
  - lr_scheduler_warmup_steps: 500
88
+ - training_steps: 10000
89
+ - mixed_precision_training: Native AMP
90
 
91
  ### Framework versions
92
 
93
+ - Transformers 4.37.2
94
+ - Pytorch 2.1.0
95
+ - Datasets 2.16.1
96
+ - Tokenizers 0.15.1
97
+
98
+ ## Citation
99
+
100
+ Cite using BibTeX:
101
+
102
+ ```
103
+ @misc{thonburian_whisper_med,
104
+ author = { Atirut Boribalburephan, Zaw Htet Aung, Knot Pipatsrisawat, Titipat Achakulvisut },
105
+ title = { Thonburian Whisper: A fine-tuned Whisper model for Thai automatic speech recognition },
106
+ year = 2022,
107
+ url = { https://huggingface.co/biodatlab/whisper-th-medium-combined },
108
+ doi = { 10.57967/hf/0226 },
109
+ publisher = { Hugging Face }
110
+ }
111
+ ```
added_tokens.json CHANGED
@@ -1,4 +1,1505 @@
1
  {
2
  "<|af|>": 50327,
3
  "<|am|>": 50334,
4
  "<|ar|>": 50272,
 
1
  {
2
+ "<|0.00|>": 50364,
3
+ "<|0.02|>": 50365,
4
+ "<|0.04|>": 50366,
5
+ "<|0.06|>": 50367,
6
+ "<|0.08|>": 50368,
7
+ "<|0.10|>": 50369,
8
+ "<|0.12|>": 50370,
9
+ "<|0.14|>": 50371,
10
+ "<|0.16|>": 50372,
11
+ "<|0.18|>": 50373,
12
+ "<|0.20|>": 50374,
13
+ "<|0.22|>": 50375,
14
+ "<|0.24|>": 50376,
15
+ "<|0.26|>": 50377,
16
+ "<|0.28|>": 50378,
17
+ "<|0.30|>": 50379,
18
+ "<|0.32|>": 50380,
19
+ "<|0.34|>": 50381,
20
+ "<|0.36|>": 50382,
21
+ "<|0.38|>": 50383,
22
+ "<|0.40|>": 50384,
23
+ "<|0.42|>": 50385,
24
+ "<|0.44|>": 50386,
25
+ "<|0.46|>": 50387,
26
+ "<|0.48|>": 50388,
27
+ "<|0.50|>": 50389,
28
+ "<|0.52|>": 50390,
29
+ "<|0.54|>": 50391,
30
+ "<|0.56|>": 50392,
31
+ "<|0.58|>": 50393,
32
+ "<|0.60|>": 50394,
33
+ "<|0.62|>": 50395,
34
+ "<|0.64|>": 50396,
35
+ "<|0.66|>": 50397,
36
+ "<|0.68|>": 50398,
37
+ "<|0.70|>": 50399,
38
+ "<|0.72|>": 50400,
39
+ "<|0.74|>": 50401,
40
+ "<|0.76|>": 50402,
41
+ "<|0.78|>": 50403,
42
+ "<|0.80|>": 50404,
43
+ "<|0.82|>": 50405,
44
+ "<|0.84|>": 50406,
45
+ "<|0.86|>": 50407,
46
+ "<|0.88|>": 50408,
47
+ "<|0.90|>": 50409,
48
+ "<|0.92|>": 50410,
49
+ "<|0.94|>": 50411,
50
+ "<|0.96|>": 50412,
51
+ "<|0.98|>": 50413,
52
+ "<|1.00|>": 50414,
53
+ "<|1.02|>": 50415,
54
+ "<|1.04|>": 50416,
55
+ "<|1.06|>": 50417,
56
+ "<|1.08|>": 50418,
57
+ "<|1.10|>": 50419,
58
+ "<|1.12|>": 50420,
59
+ "<|1.14|>": 50421,
60
+ "<|1.16|>": 50422,
61
+ "<|1.18|>": 50423,
62
+ "<|1.20|>": 50424,
63
+ "<|1.22|>": 50425,
64
+ "<|1.24|>": 50426,
65
+ "<|1.26|>": 50427,
66
+ "<|1.28|>": 50428,
67
+ "<|1.30|>": 50429,
68
+ "<|1.32|>": 50430,
69
+ "<|1.34|>": 50431,
70
+ "<|1.36|>": 50432,
71
+ "<|1.38|>": 50433,
72
+ "<|1.40|>": 50434,
73
+ "<|1.42|>": 50435,
74
+ "<|1.44|>": 50436,
75
+ "<|1.46|>": 50437,
76
+ "<|1.48|>": 50438,
77
+ "<|1.50|>": 50439,
78
+ "<|1.52|>": 50440,
79
+ "<|1.54|>": 50441,
80
+ "<|1.56|>": 50442,
81
+ "<|1.58|>": 50443,
82
+ "<|1.60|>": 50444,
83
+ "<|1.62|>": 50445,
84
+ "<|1.64|>": 50446,
85
+ "<|1.66|>": 50447,
86
+ "<|1.68|>": 50448,
87
+ "<|1.70|>": 50449,
88
+ "<|1.72|>": 50450,
89
+ "<|1.74|>": 50451,
90
+ "<|1.76|>": 50452,
91
+ "<|1.78|>": 50453,
92
+ "<|1.80|>": 50454,
93
+ "<|1.82|>": 50455,
94
+ "<|1.84|>": 50456,
95
+ "<|1.86|>": 50457,
96
+ "<|1.88|>": 50458,
97
+ "<|1.90|>": 50459,
98
+ "<|1.92|>": 50460,
99
+ "<|1.94|>": 50461,
100
+ "<|1.96|>": 50462,
101
+ "<|1.98|>": 50463,
102
+ "<|10.00|>": 50864,
103
+ "<|10.02|>": 50865,
104
+ "<|10.04|>": 50866,
105
+ "<|10.06|>": 50867,
106
+ "<|10.08|>": 50868,
107
+ "<|10.10|>": 50869,
108
+ "<|10.12|>": 50870,
109
+ "<|10.14|>": 50871,
110
+ "<|10.16|>": 50872,
111
+ "<|10.18|>": 50873,
112
+ "<|10.20|>": 50874,
113
+ "<|10.22|>": 50875,
114
+ "<|10.24|>": 50876,
115
+ "<|10.26|>": 50877,
116
+ "<|10.28|>": 50878,
117
+ "<|10.30|>": 50879,
118
+ "<|10.32|>": 50880,
119
+ "<|10.34|>": 50881,
120
+ "<|10.36|>": 50882,
121
+ "<|10.38|>": 50883,
122
+ "<|10.40|>": 50884,
123
+ "<|10.42|>": 50885,
124
+ "<|10.44|>": 50886,
125
+ "<|10.46|>": 50887,
126
+ "<|10.48|>": 50888,
127
+ "<|10.50|>": 50889,
128
+ "<|10.52|>": 50890,
129
+ "<|10.54|>": 50891,
130
+ "<|10.56|>": 50892,
131
+ "<|10.58|>": 50893,
132
+ "<|10.60|>": 50894,
133
+ "<|10.62|>": 50895,
134
+ "<|10.64|>": 50896,
135
+ "<|10.66|>": 50897,
136
+ "<|10.68|>": 50898,
137
+ "<|10.70|>": 50899,
138
+ "<|10.72|>": 50900,
139
+ "<|10.74|>": 50901,
140
+ "<|10.76|>": 50902,
141
+ "<|10.78|>": 50903,
142
+ "<|10.80|>": 50904,
143
+ "<|10.82|>": 50905,
144
+ "<|10.84|>": 50906,
145
+ "<|10.86|>": 50907,
146
+ "<|10.88|>": 50908,
147
+ "<|10.90|>": 50909,
148
+ "<|10.92|>": 50910,
149
+ "<|10.94|>": 50911,
150
+ "<|10.96|>": 50912,
151
+ "<|10.98|>": 50913,
152
+ "<|11.00|>": 50914,
153
+ "<|11.02|>": 50915,
154
+ "<|11.04|>": 50916,
155
+ "<|11.06|>": 50917,
156
+ "<|11.08|>": 50918,
157
+ "<|11.10|>": 50919,
158
+ "<|11.12|>": 50920,
159
+ "<|11.14|>": 50921,
160
+ "<|11.16|>": 50922,
161
+ "<|11.18|>": 50923,
162
+ "<|11.20|>": 50924,
163
+ "<|11.22|>": 50925,
164
+ "<|11.24|>": 50926,
165
+ "<|11.26|>": 50927,
166
+ "<|11.28|>": 50928,
167
+ "<|11.30|>": 50929,
168
+ "<|11.32|>": 50930,
169
+ "<|11.34|>": 50931,
170
+ "<|11.36|>": 50932,
171
+ "<|11.38|>": 50933,
172
+ "<|11.40|>": 50934,
173
+ "<|11.42|>": 50935,
174
+ "<|11.44|>": 50936,
175
+ "<|11.46|>": 50937,
176
+ "<|11.48|>": 50938,
177
+ "<|11.50|>": 50939,
178
+ "<|11.52|>": 50940,
179
+ "<|11.54|>": 50941,
180
+ "<|11.56|>": 50942,
181
+ "<|11.58|>": 50943,
182
+ "<|11.60|>": 50944,
183
+ "<|11.62|>": 50945,
184
+ "<|11.64|>": 50946,
185
+ "<|11.66|>": 50947,
186
+ "<|11.68|>": 50948,
187
+ "<|11.70|>": 50949,
188
+ "<|11.72|>": 50950,
189
+ "<|11.74|>": 50951,
190
+ "<|11.76|>": 50952,
191
+ "<|11.78|>": 50953,
192
+ "<|11.80|>": 50954,
193
+ "<|11.82|>": 50955,
194
+ "<|11.84|>": 50956,
195
+ "<|11.86|>": 50957,
196
+ "<|11.88|>": 50958,
197
+ "<|11.90|>": 50959,
198
+ "<|11.92|>": 50960,
199
+ "<|11.94|>": 50961,
200
+ "<|11.96|>": 50962,
201
+ "<|11.98|>": 50963,
202
+ "<|12.00|>": 50964,
203
+ "<|12.02|>": 50965,
204
+ "<|12.04|>": 50966,
205
+ "<|12.06|>": 50967,
206
+ "<|12.08|>": 50968,
207
+ "<|12.10|>": 50969,
208
+ "<|12.12|>": 50970,
209
+ "<|12.14|>": 50971,
210
+ "<|12.16|>": 50972,
211
+ "<|12.18|>": 50973,
212
+ "<|12.20|>": 50974,
213
+ "<|12.22|>": 50975,
214
+ "<|12.24|>": 50976,
215
+ "<|12.26|>": 50977,
216
+ "<|12.28|>": 50978,
217
+ "<|12.30|>": 50979,
218
+ "<|12.32|>": 50980,
219
+ "<|12.34|>": 50981,
220
+ "<|12.36|>": 50982,
221
+ "<|12.38|>": 50983,
222
+ "<|12.40|>": 50984,
223
+ "<|12.42|>": 50985,
224
+ "<|12.44|>": 50986,
225
+ "<|12.46|>": 50987,
226
+ "<|12.48|>": 50988,
227
+ "<|12.50|>": 50989,
228
+ "<|12.52|>": 50990,
229
+ "<|12.54|>": 50991,
230
+ "<|12.56|>": 50992,
231
+ "<|12.58|>": 50993,
232
+ "<|12.60|>": 50994,
233
+ "<|12.62|>": 50995,
234
+ "<|12.64|>": 50996,
235
+ "<|12.66|>": 50997,
236
+ "<|12.68|>": 50998,
237
+ "<|12.70|>": 50999,
238
+ "<|12.72|>": 51000,
239
+ "<|12.74|>": 51001,
240
+ "<|12.76|>": 51002,
241
+ "<|12.78|>": 51003,
242
+ "<|12.80|>": 51004,
243
+ "<|12.82|>": 51005,
244
+ "<|12.84|>": 51006,
245
+ "<|12.86|>": 51007,
246
+ "<|12.88|>": 51008,
247
+ "<|12.90|>": 51009,
248
+ "<|12.92|>": 51010,
249
+ "<|12.94|>": 51011,
250
+ "<|12.96|>": 51012,
251
+ "<|12.98|>": 51013,
252
+ "<|13.00|>": 51014,
253
+ "<|13.02|>": 51015,
254
+ "<|13.04|>": 51016,
255
+ "<|13.06|>": 51017,
256
+ "<|13.08|>": 51018,
257
+ "<|13.10|>": 51019,
258
+ "<|13.12|>": 51020,
259
+ "<|13.14|>": 51021,
260
+ "<|13.16|>": 51022,
261
+ "<|13.18|>": 51023,
262
+ "<|13.20|>": 51024,
263
+ "<|13.22|>": 51025,
264
+ "<|13.24|>": 51026,
265
+ "<|13.26|>": 51027,
266
+ "<|13.28|>": 51028,
267
+ "<|13.30|>": 51029,
268
+ "<|13.32|>": 51030,
269
+ "<|13.34|>": 51031,
270
+ "<|13.36|>": 51032,
271
+ "<|13.38|>": 51033,
272
+ "<|13.40|>": 51034,
273
+ "<|13.42|>": 51035,
274
+ "<|13.44|>": 51036,
275
+ "<|13.46|>": 51037,
276
+ "<|13.48|>": 51038,
277
+ "<|13.50|>": 51039,
278
+ "<|13.52|>": 51040,
279
+ "<|13.54|>": 51041,
280
+ "<|13.56|>": 51042,
281
+ "<|13.58|>": 51043,
282
+ "<|13.60|>": 51044,
283
+ "<|13.62|>": 51045,
284
+ "<|13.64|>": 51046,
285
+ "<|13.66|>": 51047,
286
+ "<|13.68|>": 51048,
287
+ "<|13.70|>": 51049,
288
+ "<|13.72|>": 51050,
289
+ "<|13.74|>": 51051,
290
+ "<|13.76|>": 51052,
291
+ "<|13.78|>": 51053,
292
+ "<|13.80|>": 51054,
293
+ "<|13.82|>": 51055,
294
+ "<|13.84|>": 51056,
295
+ "<|13.86|>": 51057,
296
+ "<|13.88|>": 51058,
297
+ "<|13.90|>": 51059,
298
+ "<|13.92|>": 51060,
299
+ "<|13.94|>": 51061,
300
+ "<|13.96|>": 51062,
301
+ "<|13.98|>": 51063,
302
+ "<|14.00|>": 51064,
303
+ "<|14.02|>": 51065,
304
+ "<|14.04|>": 51066,
305
+ "<|14.06|>": 51067,
306
+ "<|14.08|>": 51068,
307
+ "<|14.10|>": 51069,
308
+ "<|14.12|>": 51070,
309
+ "<|14.14|>": 51071,
310
+ "<|14.16|>": 51072,
311
+ "<|14.18|>": 51073,
312
+ "<|14.20|>": 51074,
313
+ "<|14.22|>": 51075,
314
+ "<|14.24|>": 51076,
315
+ "<|14.26|>": 51077,
316
+ "<|14.28|>": 51078,
317
+ "<|14.30|>": 51079,
318
+ "<|14.32|>": 51080,
319
+ "<|14.34|>": 51081,
320
+ "<|14.36|>": 51082,
321
+ "<|14.38|>": 51083,
322
+ "<|14.40|>": 51084,
323
+ "<|14.42|>": 51085,
324
+ "<|14.44|>": 51086,
325
+ "<|14.46|>": 51087,
326
+ "<|14.48|>": 51088,
327
+ "<|14.50|>": 51089,
328
+ "<|14.52|>": 51090,
329
+ "<|14.54|>": 51091,
330
+ "<|14.56|>": 51092,
331
+ "<|14.58|>": 51093,
332
+ "<|14.60|>": 51094,
333
+ "<|14.62|>": 51095,
334
+ "<|14.64|>": 51096,
335
+ "<|14.66|>": 51097,
336
+ "<|14.68|>": 51098,
337
+ "<|14.70|>": 51099,
338
+ "<|14.72|>": 51100,
339
+ "<|14.74|>": 51101,
340
+ "<|14.76|>": 51102,
341
+ "<|14.78|>": 51103,
342
+ "<|14.80|>": 51104,
343
+ "<|14.82|>": 51105,
344
+ "<|14.84|>": 51106,
345
+ "<|14.86|>": 51107,
346
+ "<|14.88|>": 51108,
347
+ "<|14.90|>": 51109,
348
+ "<|14.92|>": 51110,
349
+ "<|14.94|>": 51111,
350
+ "<|14.96|>": 51112,
351
+ "<|14.98|>": 51113,
352
+ "<|15.00|>": 51114,
353
+ "<|15.02|>": 51115,
354
+ "<|15.04|>": 51116,
355
+ "<|15.06|>": 51117,
356
+ "<|15.08|>": 51118,
357
+ "<|15.10|>": 51119,
358
+ "<|15.12|>": 51120,
359
+ "<|15.14|>": 51121,
360
+ "<|15.16|>": 51122,
361
+ "<|15.18|>": 51123,
362
+ "<|15.20|>": 51124,
363
+ "<|15.22|>": 51125,
364
+ "<|15.24|>": 51126,
365
+ "<|15.26|>": 51127,
366
+ "<|15.28|>": 51128,
367
+ "<|15.30|>": 51129,
368
+ "<|15.32|>": 51130,
369
+ "<|15.34|>": 51131,
370
+ "<|15.36|>": 51132,
371
+ "<|15.38|>": 51133,
372
+ "<|15.40|>": 51134,
373
+ "<|15.42|>": 51135,
374
+ "<|15.44|>": 51136,
375
+ "<|15.46|>": 51137,
376
+ "<|15.48|>": 51138,
377
+ "<|15.50|>": 51139,
378
+ "<|15.52|>": 51140,
379
+ "<|15.54|>": 51141,
380
+ "<|15.56|>": 51142,
381
+ "<|15.58|>": 51143,
382
+ "<|15.60|>": 51144,
383
+ "<|15.62|>": 51145,
384
+ "<|15.64|>": 51146,
385
+ "<|15.66|>": 51147,
386
+ "<|15.68|>": 51148,
387
+ "<|15.70|>": 51149,
388
+ "<|15.72|>": 51150,
389
+ "<|15.74|>": 51151,
390
+ "<|15.76|>": 51152,
391
+ "<|15.78|>": 51153,
392
+ "<|15.80|>": 51154,
393
+ "<|15.82|>": 51155,
394
+ "<|15.84|>": 51156,
395
+ "<|15.86|>": 51157,
396
+ "<|15.88|>": 51158,
397
+ "<|15.90|>": 51159,
398
+ "<|15.92|>": 51160,
399
+ "<|15.94|>": 51161,
400
+ "<|15.96|>": 51162,
401
+ "<|15.98|>": 51163,
402
+ "<|16.00|>": 51164,
403
+ "<|16.02|>": 51165,
404
+ "<|16.04|>": 51166,
405
+ "<|16.06|>": 51167,
406
+ "<|16.08|>": 51168,
407
+ "<|16.10|>": 51169,
408
+ "<|16.12|>": 51170,
409
+ "<|16.14|>": 51171,
410
+ "<|16.16|>": 51172,
411
+ "<|16.18|>": 51173,
412
+ "<|16.20|>": 51174,
413
+ "<|16.22|>": 51175,
414
+ "<|16.24|>": 51176,
415
+ "<|16.26|>": 51177,
416
+ "<|16.28|>": 51178,
417
+ "<|16.30|>": 51179,
418
+ "<|16.32|>": 51180,
419
+ "<|16.34|>": 51181,
420
+ "<|16.36|>": 51182,
421
+ "<|16.38|>": 51183,
422
+ "<|16.40|>": 51184,
423
+ "<|16.42|>": 51185,
424
+ "<|16.44|>": 51186,
425
+ "<|16.46|>": 51187,
426
+ "<|16.48|>": 51188,
427
+ "<|16.50|>": 51189,
428
+ "<|16.52|>": 51190,
429
+ "<|16.54|>": 51191,
430
+ "<|16.56|>": 51192,
431
+ "<|16.58|>": 51193,
432
+ "<|16.60|>": 51194,
433
+ "<|16.62|>": 51195,
434
+ "<|16.64|>": 51196,
435
+ "<|16.66|>": 51197,
436
+ "<|16.68|>": 51198,
437
+ "<|16.70|>": 51199,
438
+ "<|16.72|>": 51200,
439
+ "<|16.74|>": 51201,
440
+ "<|16.76|>": 51202,
441
+ "<|16.78|>": 51203,
442
+ "<|16.80|>": 51204,
443
+ "<|16.82|>": 51205,
444
+ "<|16.84|>": 51206,
445
+ "<|16.86|>": 51207,
446
+ "<|16.88|>": 51208,
447
+ "<|16.90|>": 51209,
448
+ "<|16.92|>": 51210,
449
+ "<|16.94|>": 51211,
450
+ "<|16.96|>": 51212,
451
+ "<|16.98|>": 51213,
452
+ "<|17.00|>": 51214,
453
+ "<|17.02|>": 51215,
454
+ "<|17.04|>": 51216,
455
+ "<|17.06|>": 51217,
456
+ "<|17.08|>": 51218,
457
+ "<|17.10|>": 51219,
458
+ "<|17.12|>": 51220,
459
+ "<|17.14|>": 51221,
460
+ "<|17.16|>": 51222,
461
+ "<|17.18|>": 51223,
462
+ "<|17.20|>": 51224,
463
+ "<|17.22|>": 51225,
464
+ "<|17.24|>": 51226,
465
+ "<|17.26|>": 51227,
466
+ "<|17.28|>": 51228,
467
+ "<|17.30|>": 51229,
468
+ "<|17.32|>": 51230,
469
+ "<|17.34|>": 51231,
470
+ "<|17.36|>": 51232,
471
+ "<|17.38|>": 51233,
472
+ "<|17.40|>": 51234,
473
+ "<|17.42|>": 51235,
474
+ "<|17.44|>": 51236,
475
+ "<|17.46|>": 51237,
476
+ "<|17.48|>": 51238,
477
+ "<|17.50|>": 51239,
478
+ "<|17.52|>": 51240,
479
+ "<|17.54|>": 51241,
480
+ "<|17.56|>": 51242,
481
+ "<|17.58|>": 51243,
482
+ "<|17.60|>": 51244,
483
+ "<|17.62|>": 51245,
484
+ "<|17.64|>": 51246,
485
+ "<|17.66|>": 51247,
486
+ "<|17.68|>": 51248,
487
+ "<|17.70|>": 51249,
488
+ "<|17.72|>": 51250,
489
+ "<|17.74|>": 51251,
490
+ "<|17.76|>": 51252,
491
+ "<|17.78|>": 51253,
492
+ "<|17.80|>": 51254,
493
+ "<|17.82|>": 51255,
494
+ "<|17.84|>": 51256,
495
+ "<|17.86|>": 51257,
496
+ "<|17.88|>": 51258,
497
+ "<|17.90|>": 51259,
498
+ "<|17.92|>": 51260,
499
+ "<|17.94|>": 51261,
500
+ "<|17.96|>": 51262,
501
+ "<|17.98|>": 51263,
502
+ "<|18.00|>": 51264,
503
+ "<|18.02|>": 51265,
504
+ "<|18.04|>": 51266,
505
+ "<|18.06|>": 51267,
506
+ "<|18.08|>": 51268,
507
+ "<|18.10|>": 51269,
508
+ "<|18.12|>": 51270,
509
+ "<|18.14|>": 51271,
510
+ "<|18.16|>": 51272,
511
+ "<|18.18|>": 51273,
512
+ "<|18.20|>": 51274,
513
+ "<|18.22|>": 51275,
514
+ "<|18.24|>": 51276,
515
+ "<|18.26|>": 51277,
516
+ "<|18.28|>": 51278,
517
+ "<|18.30|>": 51279,
518
+ "<|18.32|>": 51280,
519
+ "<|18.34|>": 51281,
520
+ "<|18.36|>": 51282,
521
+ "<|18.38|>": 51283,
522
+ "<|18.40|>": 51284,
523
+ "<|18.42|>": 51285,
524
+ "<|18.44|>": 51286,
525
+ "<|18.46|>": 51287,
526
+ "<|18.48|>": 51288,
527
+ "<|18.50|>": 51289,
528
+ "<|18.52|>": 51290,
529
+ "<|18.54|>": 51291,
530
+ "<|18.56|>": 51292,
531
+ "<|18.58|>": 51293,
532
+ "<|18.60|>": 51294,
533
+ "<|18.62|>": 51295,
534
+ "<|18.64|>": 51296,
535
+ "<|18.66|>": 51297,
536
+ "<|18.68|>": 51298,
537
+ "<|18.70|>": 51299,
538
+ "<|18.72|>": 51300,
539
+ "<|18.74|>": 51301,
540
+ "<|18.76|>": 51302,
541
+ "<|18.78|>": 51303,
542
+ "<|18.80|>": 51304,
543
+ "<|18.82|>": 51305,
544
+ "<|18.84|>": 51306,
545
+ "<|18.86|>": 51307,
546
+ "<|18.88|>": 51308,
547
+ "<|18.90|>": 51309,
548
+ "<|18.92|>": 51310,
549
+ "<|18.94|>": 51311,
550
+ "<|18.96|>": 51312,
551
+ "<|18.98|>": 51313,
552
+ "<|19.00|>": 51314,
553
+ "<|19.02|>": 51315,
554
+ "<|19.04|>": 51316,
555
+ "<|19.06|>": 51317,
556
+ "<|19.08|>": 51318,
557
+ "<|19.10|>": 51319,
558
+ "<|19.12|>": 51320,
559
+ "<|19.14|>": 51321,
560
+ "<|19.16|>": 51322,
561
+ "<|19.18|>": 51323,
562
+ "<|19.20|>": 51324,
563
+ "<|19.22|>": 51325,
564
+ "<|19.24|>": 51326,
565
+ "<|19.26|>": 51327,
566
+ "<|19.28|>": 51328,
567
+ "<|19.30|>": 51329,
568
+ "<|19.32|>": 51330,
569
+ "<|19.34|>": 51331,
570
+ "<|19.36|>": 51332,
571
+ "<|19.38|>": 51333,
572
+ "<|19.40|>": 51334,
573
+ "<|19.42|>": 51335,
574
+ "<|19.44|>": 51336,
575
+ "<|19.46|>": 51337,
576
+ "<|19.48|>": 51338,
577
+ "<|19.50|>": 51339,
578
+ "<|19.52|>": 51340,
579
+ "<|19.54|>": 51341,
580
+ "<|19.56|>": 51342,
581
+ "<|19.58|>": 51343,
582
+ "<|19.60|>": 51344,
583
+ "<|19.62|>": 51345,
584
+ "<|19.64|>": 51346,
585
+ "<|19.66|>": 51347,
586
+ "<|19.68|>": 51348,
587
+ "<|19.70|>": 51349,
588
+ "<|19.72|>": 51350,
589
+ "<|19.74|>": 51351,
590
+ "<|19.76|>": 51352,
591
+ "<|19.78|>": 51353,
592
+ "<|19.80|>": 51354,
593
+ "<|19.82|>": 51355,
594
+ "<|19.84|>": 51356,
595
+ "<|19.86|>": 51357,
596
+ "<|19.88|>": 51358,
597
+ "<|19.90|>": 51359,
598
+ "<|19.92|>": 51360,
599
+ "<|19.94|>": 51361,
600
+ "<|19.96|>": 51362,
601
+ "<|19.98|>": 51363,
602
+ "<|2.00|>": 50464,
603
+ "<|2.02|>": 50465,
604
+ "<|2.04|>": 50466,
605
+ "<|2.06|>": 50467,
606
+ "<|2.08|>": 50468,
607
+ "<|2.10|>": 50469,
608
+ "<|2.12|>": 50470,
609
+ "<|2.14|>": 50471,
610
+ "<|2.16|>": 50472,
611
+ "<|2.18|>": 50473,
612
+ "<|2.20|>": 50474,
613
+ "<|2.22|>": 50475,
614
+ "<|2.24|>": 50476,
615
+ "<|2.26|>": 50477,
616
+ "<|2.28|>": 50478,
617
+ "<|2.30|>": 50479,
618
+ "<|2.32|>": 50480,
619
+ "<|2.34|>": 50481,
620
+ "<|2.36|>": 50482,
621
+ "<|2.38|>": 50483,
622
+ "<|2.40|>": 50484,
623
+ "<|2.42|>": 50485,
624
+ "<|2.44|>": 50486,
625
+ "<|2.46|>": 50487,
626
+ "<|2.48|>": 50488,
627
+ "<|2.50|>": 50489,
628
+ "<|2.52|>": 50490,
629
+ "<|2.54|>": 50491,
630
+ "<|2.56|>": 50492,
631
+ "<|2.58|>": 50493,
632
+ "<|2.60|>": 50494,
633
+ "<|2.62|>": 50495,
634
+ "<|2.64|>": 50496,
635
+ "<|2.66|>": 50497,
636
+ "<|2.68|>": 50498,
637
+ "<|2.70|>": 50499,
638
+ "<|2.72|>": 50500,
639
+ "<|2.74|>": 50501,
640
+ "<|2.76|>": 50502,
641
+ "<|2.78|>": 50503,
642
+ "<|2.80|>": 50504,
643
+ "<|2.82|>": 50505,
644
+ "<|2.84|>": 50506,
645
+ "<|2.86|>": 50507,
646
+ "<|2.88|>": 50508,
647
+ "<|2.90|>": 50509,
648
+ "<|2.92|>": 50510,
649
+ "<|2.94|>": 50511,
650
+ "<|2.96|>": 50512,
651
+ "<|2.98|>": 50513,
652
+ "<|20.00|>": 51364,
653
+ "<|20.02|>": 51365,
654
+ "<|20.04|>": 51366,
655
+ "<|20.06|>": 51367,
656
+ "<|20.08|>": 51368,
657
+ "<|20.10|>": 51369,
658
+ "<|20.12|>": 51370,
659
+ "<|20.14|>": 51371,
660
+ "<|20.16|>": 51372,
661
+ "<|20.18|>": 51373,
662
+ "<|20.20|>": 51374,
663
+ "<|20.22|>": 51375,
664
+ "<|20.24|>": 51376,
665
+ "<|20.26|>": 51377,
666
+ "<|20.28|>": 51378,
667
+ "<|20.30|>": 51379,
668
+ "<|20.32|>": 51380,
669
+ "<|20.34|>": 51381,
670
+ "<|20.36|>": 51382,
671
+ "<|20.38|>": 51383,
672
+ "<|20.40|>": 51384,
673
+ "<|20.42|>": 51385,
674
+ "<|20.44|>": 51386,
675
+ "<|20.46|>": 51387,
676
+ "<|20.48|>": 51388,
677
+ "<|20.50|>": 51389,
678
+ "<|20.52|>": 51390,
679
+ "<|20.54|>": 51391,
680
+ "<|20.56|>": 51392,
681
+ "<|20.58|>": 51393,
682
+ "<|20.60|>": 51394,
683
+ "<|20.62|>": 51395,
684
+ "<|20.64|>": 51396,
685
+ "<|20.66|>": 51397,
686
+ "<|20.68|>": 51398,
687
+ "<|20.70|>": 51399,
688
+ "<|20.72|>": 51400,
689
+ "<|20.74|>": 51401,
690
+ "<|20.76|>": 51402,
691
+ "<|20.78|>": 51403,
692
+ "<|20.80|>": 51404,
693
+ "<|20.82|>": 51405,
694
+ "<|20.84|>": 51406,
695
+ "<|20.86|>": 51407,
696
+ "<|20.88|>": 51408,
697
+ "<|20.90|>": 51409,
698
+ "<|20.92|>": 51410,
699
+ "<|20.94|>": 51411,
700
+ "<|20.96|>": 51412,
701
+ "<|20.98|>": 51413,
702
+ "<|21.00|>": 51414,
703
+ "<|21.02|>": 51415,
704
+ "<|21.04|>": 51416,
705
+ "<|21.06|>": 51417,
706
+ "<|21.08|>": 51418,
707
+ "<|21.10|>": 51419,
708
+ "<|21.12|>": 51420,
709
+ "<|21.14|>": 51421,
710
+ "<|21.16|>": 51422,
711
+ "<|21.18|>": 51423,
712
+ "<|21.20|>": 51424,
713
+ "<|21.22|>": 51425,
714
+ "<|21.24|>": 51426,
715
+ "<|21.26|>": 51427,
716
+ "<|21.28|>": 51428,
717
+ "<|21.30|>": 51429,
718
+ "<|21.32|>": 51430,
719
+ "<|21.34|>": 51431,
720
+ "<|21.36|>": 51432,
721
+ "<|21.38|>": 51433,
722
+ "<|21.40|>": 51434,
723
+ "<|21.42|>": 51435,
724
+ "<|21.44|>": 51436,
725
+ "<|21.46|>": 51437,
726
+ "<|21.48|>": 51438,
727
+ "<|21.50|>": 51439,
728
+ "<|21.52|>": 51440,
729
+ "<|21.54|>": 51441,
730
+ "<|21.56|>": 51442,
731
+ "<|21.58|>": 51443,
732
+ "<|21.60|>": 51444,
733
+ "<|21.62|>": 51445,
734
+ "<|21.64|>": 51446,
735
+ "<|21.66|>": 51447,
736
+ "<|21.68|>": 51448,
737
+ "<|21.70|>": 51449,
738
+ "<|21.72|>": 51450,
739
+ "<|21.74|>": 51451,
740
+ "<|21.76|>": 51452,
741
+ "<|21.78|>": 51453,
742
+ "<|21.80|>": 51454,
743
+ "<|21.82|>": 51455,
744
+ "<|21.84|>": 51456,
745
+ "<|21.86|>": 51457,
746
+ "<|21.88|>": 51458,
747
+ "<|21.90|>": 51459,
748
+ "<|21.92|>": 51460,
749
+ "<|21.94|>": 51461,
750
+ "<|21.96|>": 51462,
751
+ "<|21.98|>": 51463,
752
+ "<|22.00|>": 51464,
753
+ "<|22.02|>": 51465,
754
+ "<|22.04|>": 51466,
755
+ "<|22.06|>": 51467,
756
+ "<|22.08|>": 51468,
757
+ "<|22.10|>": 51469,
758
+ "<|22.12|>": 51470,
759
+ "<|22.14|>": 51471,
760
+ "<|22.16|>": 51472,
761
+ "<|22.18|>": 51473,
762
+ "<|22.20|>": 51474,
763
+ "<|22.22|>": 51475,
764
+ "<|22.24|>": 51476,
765
+ "<|22.26|>": 51477,
766
+ "<|22.28|>": 51478,
767
+ "<|22.30|>": 51479,
768
+ "<|22.32|>": 51480,
769
+ "<|22.34|>": 51481,
770
+ "<|22.36|>": 51482,
771
+ "<|22.38|>": 51483,
772
+ "<|22.40|>": 51484,
773
+ "<|22.42|>": 51485,
774
+ "<|22.44|>": 51486,
775
+ "<|22.46|>": 51487,
776
+ "<|22.48|>": 51488,
777
+ "<|22.50|>": 51489,
778
+ "<|22.52|>": 51490,
779
+ "<|22.54|>": 51491,
780
+ "<|22.56|>": 51492,
781
+ "<|22.58|>": 51493,
782
+ "<|22.60|>": 51494,
783
+ "<|22.62|>": 51495,
784
+ "<|22.64|>": 51496,
785
+ "<|22.66|>": 51497,
786
+ "<|22.68|>": 51498,
787
+ "<|22.70|>": 51499,
788
+ "<|22.72|>": 51500,
789
+ "<|22.74|>": 51501,
790
+ "<|22.76|>": 51502,
791
+ "<|22.78|>": 51503,
792
+ "<|22.80|>": 51504,
793
+ "<|22.82|>": 51505,
794
+ "<|22.84|>": 51506,
795
+ "<|22.86|>": 51507,
796
+ "<|22.88|>": 51508,
797
+ "<|22.90|>": 51509,
798
+ "<|22.92|>": 51510,
799
+ "<|22.94|>": 51511,
800
+ "<|22.96|>": 51512,
801
+ "<|22.98|>": 51513,
802
+ "<|23.00|>": 51514,
803
+ "<|23.02|>": 51515,
804
+ "<|23.04|>": 51516,
805
+ "<|23.06|>": 51517,
806
+ "<|23.08|>": 51518,
807
+ "<|23.10|>": 51519,
808
+ "<|23.12|>": 51520,
809
+ "<|23.14|>": 51521,
810
+ "<|23.16|>": 51522,
811
+ "<|23.18|>": 51523,
812
+ "<|23.20|>": 51524,
813
+ "<|23.22|>": 51525,
814
+ "<|23.24|>": 51526,
815
+ "<|23.26|>": 51527,
816
+ "<|23.28|>": 51528,
817
+ "<|23.30|>": 51529,
818
+ "<|23.32|>": 51530,
819
+ "<|23.34|>": 51531,
820
+ "<|23.36|>": 51532,
821
+ "<|23.38|>": 51533,
822
+ "<|23.40|>": 51534,
823
+ "<|23.42|>": 51535,
824
+ "<|23.44|>": 51536,
825
+ "<|23.46|>": 51537,
826
+ "<|23.48|>": 51538,
827
+ "<|23.50|>": 51539,
828
+ "<|23.52|>": 51540,
829
+ "<|23.54|>": 51541,
830
+ "<|23.56|>": 51542,
831
+ "<|23.58|>": 51543,
832
+ "<|23.60|>": 51544,
833
+ "<|23.62|>": 51545,
834
+ "<|23.64|>": 51546,
835
+ "<|23.66|>": 51547,
836
+ "<|23.68|>": 51548,
837
+ "<|23.70|>": 51549,
838
+ "<|23.72|>": 51550,
839
+ "<|23.74|>": 51551,
840
+ "<|23.76|>": 51552,
841
+ "<|23.78|>": 51553,
842
+ "<|23.80|>": 51554,
843
+ "<|23.82|>": 51555,
844
+ "<|23.84|>": 51556,
845
+ "<|23.86|>": 51557,
846
+ "<|23.88|>": 51558,
847
+ "<|23.90|>": 51559,
848
+ "<|23.92|>": 51560,
849
+ "<|23.94|>": 51561,
850
+ "<|23.96|>": 51562,
851
+ "<|23.98|>": 51563,
852
+ "<|24.00|>": 51564,
853
+ "<|24.02|>": 51565,
854
+ "<|24.04|>": 51566,
855
+ "<|24.06|>": 51567,
856
+ "<|24.08|>": 51568,
857
+ "<|24.10|>": 51569,
858
+ "<|24.12|>": 51570,
859
+ "<|24.14|>": 51571,
860
+ "<|24.16|>": 51572,
861
+ "<|24.18|>": 51573,
862
+ "<|24.20|>": 51574,
863
+ "<|24.22|>": 51575,
864
+ "<|24.24|>": 51576,
865
+ "<|24.26|>": 51577,
866
+ "<|24.28|>": 51578,
867
+ "<|24.30|>": 51579,
868
+ "<|24.32|>": 51580,
869
+ "<|24.34|>": 51581,
870
+ "<|24.36|>": 51582,
871
+ "<|24.38|>": 51583,
872
+ "<|24.40|>": 51584,
873
+ "<|24.42|>": 51585,
874
+ "<|24.44|>": 51586,
875
+ "<|24.46|>": 51587,
876
+ "<|24.48|>": 51588,
877
+ "<|24.50|>": 51589,
878
+ "<|24.52|>": 51590,
879
+ "<|24.54|>": 51591,
880
+ "<|24.56|>": 51592,
881
+ "<|24.58|>": 51593,
882
+ "<|24.60|>": 51594,
883
+ "<|24.62|>": 51595,
884
+ "<|24.64|>": 51596,
885
+ "<|24.66|>": 51597,
886
+ "<|24.68|>": 51598,
887
+ "<|24.70|>": 51599,
888
+ "<|24.72|>": 51600,
889
+ "<|24.74|>": 51601,
890
+ "<|24.76|>": 51602,
891
+ "<|24.78|>": 51603,
892
+ "<|24.80|>": 51604,
893
+ "<|24.82|>": 51605,
894
+ "<|24.84|>": 51606,
895
+ "<|24.86|>": 51607,
896
+ "<|24.88|>": 51608,
897
+ "<|24.90|>": 51609,
898
+ "<|24.92|>": 51610,
899
+ "<|24.94|>": 51611,
900
+ "<|24.96|>": 51612,
901
+ "<|24.98|>": 51613,
902
+ "<|25.00|>": 51614,
903
+ "<|25.02|>": 51615,
904
+ "<|25.04|>": 51616,
905
+ "<|25.06|>": 51617,
906
+ "<|25.08|>": 51618,
907
+ "<|25.10|>": 51619,
908
+ "<|25.12|>": 51620,
909
+ "<|25.14|>": 51621,
910
+ "<|25.16|>": 51622,
911
+ "<|25.18|>": 51623,
912
+ "<|25.20|>": 51624,
913
+ "<|25.22|>": 51625,
914
+ "<|25.24|>": 51626,
915
+ "<|25.26|>": 51627,
916
+ "<|25.28|>": 51628,
917
+ "<|25.30|>": 51629,
918
+ "<|25.32|>": 51630,
919
+ "<|25.34|>": 51631,
920
+ "<|25.36|>": 51632,
921
+ "<|25.38|>": 51633,
922
+ "<|25.40|>": 51634,
923
+ "<|25.42|>": 51635,
924
+ "<|25.44|>": 51636,
925
+ "<|25.46|>": 51637,
926
+ "<|25.48|>": 51638,
927
+ "<|25.50|>": 51639,
928
+ "<|25.52|>": 51640,
929
+ "<|25.54|>": 51641,
930
+ "<|25.56|>": 51642,
931
+ "<|25.58|>": 51643,
932
+ "<|25.60|>": 51644,
933
+ "<|25.62|>": 51645,
934
+ "<|25.64|>": 51646,
935
+ "<|25.66|>": 51647,
936
+ "<|25.68|>": 51648,
937
+ "<|25.70|>": 51649,
938
+ "<|25.72|>": 51650,
939
+ "<|25.74|>": 51651,
940
+ "<|25.76|>": 51652,
941
+ "<|25.78|>": 51653,
942
+ "<|25.80|>": 51654,
943
+ "<|25.82|>": 51655,
944
+ "<|25.84|>": 51656,
945
+ "<|25.86|>": 51657,
946
+ "<|25.88|>": 51658,
947
+ "<|25.90|>": 51659,
948
+ "<|25.92|>": 51660,
949
+ "<|25.94|>": 51661,
950
+ "<|25.96|>": 51662,
951
+ "<|25.98|>": 51663,
952
+ "<|26.00|>": 51664,
953
+ "<|26.02|>": 51665,
954
+ "<|26.04|>": 51666,
955
+ "<|26.06|>": 51667,
956
+ "<|26.08|>": 51668,
957
+ "<|26.10|>": 51669,
958
+ "<|26.12|>": 51670,
959
+ "<|26.14|>": 51671,
960
+ "<|26.16|>": 51672,
961
+ "<|26.18|>": 51673,
962
+ "<|26.20|>": 51674,
963
+ "<|26.22|>": 51675,
964
+ "<|26.24|>": 51676,
965
+ "<|26.26|>": 51677,
966
+ "<|26.28|>": 51678,
967
+ "<|26.30|>": 51679,
968
+ "<|26.32|>": 51680,
969
+ "<|26.34|>": 51681,
970
+ "<|26.36|>": 51682,
971
+ "<|26.38|>": 51683,
972
+ "<|26.40|>": 51684,
973
+ "<|26.42|>": 51685,
974
+ "<|26.44|>": 51686,
975
+ "<|26.46|>": 51687,
976
+ "<|26.48|>": 51688,
977
+ "<|26.50|>": 51689,
978
+ "<|26.52|>": 51690,
979
+ "<|26.54|>": 51691,
980
+ "<|26.56|>": 51692,
981
+ "<|26.58|>": 51693,
982
+ "<|26.60|>": 51694,
983
+ "<|26.62|>": 51695,
984
+ "<|26.64|>": 51696,
985
+ "<|26.66|>": 51697,
986
+ "<|26.68|>": 51698,
987
+ "<|26.70|>": 51699,
988
+ "<|26.72|>": 51700,
989
+ "<|26.74|>": 51701,
990
+ "<|26.76|>": 51702,
991
+ "<|26.78|>": 51703,
992
+ "<|26.80|>": 51704,
993
+ "<|26.82|>": 51705,
994
+ "<|26.84|>": 51706,
995
+ "<|26.86|>": 51707,
996
+ "<|26.88|>": 51708,
997
+ "<|26.90|>": 51709,
998
+ "<|26.92|>": 51710,
999
+ "<|26.94|>": 51711,
1000
+ "<|26.96|>": 51712,
1001
+ "<|26.98|>": 51713,
1002
+ "<|27.00|>": 51714,
1003
+ "<|27.02|>": 51715,
1004
+ "<|27.04|>": 51716,
1005
+ "<|27.06|>": 51717,
1006
+ "<|27.08|>": 51718,
1007
+ "<|27.10|>": 51719,
1008
+ "<|27.12|>": 51720,
1009
+ "<|27.14|>": 51721,
1010
+ "<|27.16|>": 51722,
1011
+ "<|27.18|>": 51723,
1012
+ "<|27.20|>": 51724,
1013
+ "<|27.22|>": 51725,
1014
+ "<|27.24|>": 51726,
1015
+ "<|27.26|>": 51727,
1016
+ "<|27.28|>": 51728,
1017
+ "<|27.30|>": 51729,
1018
+ "<|27.32|>": 51730,
1019
+ "<|27.34|>": 51731,
1020
+ "<|27.36|>": 51732,
1021
+ "<|27.38|>": 51733,
1022
+ "<|27.40|>": 51734,
1023
+ "<|27.42|>": 51735,
1024
+ "<|27.44|>": 51736,
1025
+ "<|27.46|>": 51737,
1026
+ "<|27.48|>": 51738,
1027
+ "<|27.50|>": 51739,
1028
+ "<|27.52|>": 51740,
1029
+ "<|27.54|>": 51741,
1030
+ "<|27.56|>": 51742,
1031
+ "<|27.58|>": 51743,
1032
+ "<|27.60|>": 51744,
1033
+ "<|27.62|>": 51745,
1034
+ "<|27.64|>": 51746,
1035
+ "<|27.66|>": 51747,
1036
+ "<|27.68|>": 51748,
1037
+ "<|27.70|>": 51749,
1038
+ "<|27.72|>": 51750,
1039
+ "<|27.74|>": 51751,
1040
+ "<|27.76|>": 51752,
1041
+ "<|27.78|>": 51753,
1042
+ "<|27.80|>": 51754,
1043
+ "<|27.82|>": 51755,
1044
+ "<|27.84|>": 51756,
1045
+ "<|27.86|>": 51757,
1046
+ "<|27.88|>": 51758,
1047
+ "<|27.90|>": 51759,
1048
+ "<|27.92|>": 51760,
1049
+ "<|27.94|>": 51761,
1050
+ "<|27.96|>": 51762,
1051
+ "<|27.98|>": 51763,
1052
+ "<|28.00|>": 51764,
1053
+ "<|28.02|>": 51765,
1054
+ "<|28.04|>": 51766,
1055
+ "<|28.06|>": 51767,
1056
+ "<|28.08|>": 51768,
1057
+ "<|28.10|>": 51769,
1058
+ "<|28.12|>": 51770,
1059
+ "<|28.14|>": 51771,
1060
+ "<|28.16|>": 51772,
1061
+ "<|28.18|>": 51773,
1062
+ "<|28.20|>": 51774,
1063
+ "<|28.22|>": 51775,
1064
+ "<|28.24|>": 51776,
1065
+ "<|28.26|>": 51777,
1066
+ "<|28.28|>": 51778,
1067
+ "<|28.30|>": 51779,
1068
+ "<|28.32|>": 51780,
1069
+ "<|28.34|>": 51781,
1070
+ "<|28.36|>": 51782,
1071
+ "<|28.38|>": 51783,
1072
+ "<|28.40|>": 51784,
1073
+ "<|28.42|>": 51785,
1074
+ "<|28.44|>": 51786,
1075
+ "<|28.46|>": 51787,
1076
+ "<|28.48|>": 51788,
1077
+ "<|28.50|>": 51789,
1078
+ "<|28.52|>": 51790,
1079
+ "<|28.54|>": 51791,
1080
+ "<|28.56|>": 51792,
1081
+ "<|28.58|>": 51793,
1082
+ "<|28.60|>": 51794,
1083
+ "<|28.62|>": 51795,
1084
+ "<|28.64|>": 51796,
1085
+ "<|28.66|>": 51797,
1086
+ "<|28.68|>": 51798,
1087
+ "<|28.70|>": 51799,
1088
+ "<|28.72|>": 51800,
1089
+ "<|28.74|>": 51801,
1090
+ "<|28.76|>": 51802,
1091
+ "<|28.78|>": 51803,
1092
+ "<|28.80|>": 51804,
1093
+ "<|28.82|>": 51805,
1094
+ "<|28.84|>": 51806,
1095
+ "<|28.86|>": 51807,
1096
+ "<|28.88|>": 51808,
1097
+ "<|28.90|>": 51809,
1098
+ "<|28.92|>": 51810,
1099
+ "<|28.94|>": 51811,
1100
+ "<|28.96|>": 51812,
1101
+ "<|28.98|>": 51813,
1102
+ "<|29.00|>": 51814,
1103
+ "<|29.02|>": 51815,
1104
+ "<|29.04|>": 51816,
1105
+ "<|29.06|>": 51817,
1106
+ "<|29.08|>": 51818,
1107
+ "<|29.10|>": 51819,
1108
+ "<|29.12|>": 51820,
1109
+ "<|29.14|>": 51821,
1110
+ "<|29.16|>": 51822,
1111
+ "<|29.18|>": 51823,
1112
+ "<|29.20|>": 51824,
1113
+ "<|29.22|>": 51825,
1114
+ "<|29.24|>": 51826,
1115
+ "<|29.26|>": 51827,
1116
+ "<|29.28|>": 51828,
1117
+ "<|29.30|>": 51829,
1118
+ "<|29.32|>": 51830,
1119
+ "<|29.34|>": 51831,
1120
+ "<|29.36|>": 51832,
1121
+ "<|29.38|>": 51833,
1122
+ "<|29.40|>": 51834,
1123
+ "<|29.42|>": 51835,
1124
+ "<|29.44|>": 51836,
1125
+ "<|29.46|>": 51837,
1126
+ "<|29.48|>": 51838,
1127
+ "<|29.50|>": 51839,
1128
+ "<|29.52|>": 51840,
1129
+ "<|29.54|>": 51841,
1130
+ "<|29.56|>": 51842,
1131
+ "<|29.58|>": 51843,
1132
+ "<|29.60|>": 51844,
1133
+ "<|29.62|>": 51845,
1134
+ "<|29.64|>": 51846,
1135
+ "<|29.66|>": 51847,
1136
+ "<|29.68|>": 51848,
1137
+ "<|29.70|>": 51849,
1138
+ "<|29.72|>": 51850,
1139
+ "<|29.74|>": 51851,
1140
+ "<|29.76|>": 51852,
1141
+ "<|29.78|>": 51853,
1142
+ "<|29.80|>": 51854,
1143
+ "<|29.82|>": 51855,
1144
+ "<|29.84|>": 51856,
1145
+ "<|29.86|>": 51857,
1146
+ "<|29.88|>": 51858,
1147
+ "<|29.90|>": 51859,
1148
+ "<|29.92|>": 51860,
1149
+ "<|29.94|>": 51861,
1150
+ "<|29.96|>": 51862,
1151
+ "<|29.98|>": 51863,
1152
+ "<|3.00|>": 50514,
1153
+ "<|3.02|>": 50515,
1154
+ "<|3.04|>": 50516,
1155
+ "<|3.06|>": 50517,
1156
+ "<|3.08|>": 50518,
1157
+ "<|3.10|>": 50519,
1158
+ "<|3.12|>": 50520,
1159
+ "<|3.14|>": 50521,
1160
+ "<|3.16|>": 50522,
1161
+ "<|3.18|>": 50523,
1162
+ "<|3.20|>": 50524,
1163
+ "<|3.22|>": 50525,
1164
+ "<|3.24|>": 50526,
1165
+ "<|3.26|>": 50527,
1166
+ "<|3.28|>": 50528,
1167
+ "<|3.30|>": 50529,
1168
+ "<|3.32|>": 50530,
1169
+ "<|3.34|>": 50531,
1170
+ "<|3.36|>": 50532,
1171
+ "<|3.38|>": 50533,
1172
+ "<|3.40|>": 50534,
1173
+ "<|3.42|>": 50535,
1174
+ "<|3.44|>": 50536,
1175
+ "<|3.46|>": 50537,
1176
+ "<|3.48|>": 50538,
1177
+ "<|3.50|>": 50539,
1178
+ "<|3.52|>": 50540,
1179
+ "<|3.54|>": 50541,
1180
+ "<|3.56|>": 50542,
1181
+ "<|3.58|>": 50543,
1182
+ "<|3.60|>": 50544,
1183
+ "<|3.62|>": 50545,
1184
+ "<|3.64|>": 50546,
1185
+ "<|3.66|>": 50547,
1186
+ "<|3.68|>": 50548,
1187
+ "<|3.70|>": 50549,
1188
+ "<|3.72|>": 50550,
1189
+ "<|3.74|>": 50551,
1190
+ "<|3.76|>": 50552,
1191
+ "<|3.78|>": 50553,
1192
+ "<|3.80|>": 50554,
1193
+ "<|3.82|>": 50555,
1194
+ "<|3.84|>": 50556,
1195
+ "<|3.86|>": 50557,
1196
+ "<|3.88|>": 50558,
1197
+ "<|3.90|>": 50559,
1198
+ "<|3.92|>": 50560,
1199
+ "<|3.94|>": 50561,
1200
+ "<|3.96|>": 50562,
1201
+ "<|3.98|>": 50563,
1202
+ "<|30.00|>": 51864,
1203
+ "<|4.00|>": 50564,
1204
+ "<|4.02|>": 50565,
1205
+ "<|4.04|>": 50566,
1206
+ "<|4.06|>": 50567,
1207
+ "<|4.08|>": 50568,
1208
+ "<|4.10|>": 50569,
1209
+ "<|4.12|>": 50570,
1210
+ "<|4.14|>": 50571,
1211
+ "<|4.16|>": 50572,
1212
+ "<|4.18|>": 50573,
1213
+ "<|4.20|>": 50574,
1214
+ "<|4.22|>": 50575,
1215
+ "<|4.24|>": 50576,
1216
+ "<|4.26|>": 50577,
1217
+ "<|4.28|>": 50578,
1218
+ "<|4.30|>": 50579,
1219
+ "<|4.32|>": 50580,
1220
+ "<|4.34|>": 50581,
1221
+ "<|4.36|>": 50582,
1222
+ "<|4.38|>": 50583,
1223
+ "<|4.40|>": 50584,
1224
+ "<|4.42|>": 50585,
1225
+ "<|4.44|>": 50586,
1226
+ "<|4.46|>": 50587,
1227
+ "<|4.48|>": 50588,
1228
+ "<|4.50|>": 50589,
1229
+ "<|4.52|>": 50590,
1230
+ "<|4.54|>": 50591,
1231
+ "<|4.56|>": 50592,
1232
+ "<|4.58|>": 50593,
1233
+ "<|4.60|>": 50594,
1234
+ "<|4.62|>": 50595,
1235
+ "<|4.64|>": 50596,
1236
+ "<|4.66|>": 50597,
1237
+ "<|4.68|>": 50598,
1238
+ "<|4.70|>": 50599,
1239
+ "<|4.72|>": 50600,
1240
+ "<|4.74|>": 50601,
1241
+ "<|4.76|>": 50602,
1242
+ "<|4.78|>": 50603,
1243
+ "<|4.80|>": 50604,
1244
+ "<|4.82|>": 50605,
1245
+ "<|4.84|>": 50606,
1246
+ "<|4.86|>": 50607,
1247
+ "<|4.88|>": 50608,
1248
+ "<|4.90|>": 50609,
1249
+ "<|4.92|>": 50610,
1250
+ "<|4.94|>": 50611,
1251
+ "<|4.96|>": 50612,
1252
+ "<|4.98|>": 50613,
1253
+ "<|5.00|>": 50614,
1254
+ "<|5.02|>": 50615,
1255
+ "<|5.04|>": 50616,
1256
+ "<|5.06|>": 50617,
1257
+ "<|5.08|>": 50618,
1258
+ "<|5.10|>": 50619,
1259
+ "<|5.12|>": 50620,
1260
+ "<|5.14|>": 50621,
1261
+ "<|5.16|>": 50622,
1262
+ "<|5.18|>": 50623,
1263
+ "<|5.20|>": 50624,
1264
+ "<|5.22|>": 50625,
1265
+ "<|5.24|>": 50626,
1266
+ "<|5.26|>": 50627,
1267
+ "<|5.28|>": 50628,
1268
+ "<|5.30|>": 50629,
1269
+ "<|5.32|>": 50630,
1270
+ "<|5.34|>": 50631,
1271
+ "<|5.36|>": 50632,
1272
+ "<|5.38|>": 50633,
1273
+ "<|5.40|>": 50634,
1274
+ "<|5.42|>": 50635,
1275
+ "<|5.44|>": 50636,
1276
+ "<|5.46|>": 50637,
1277
+ "<|5.48|>": 50638,
1278
+ "<|5.50|>": 50639,
1279
+ "<|5.52|>": 50640,
1280
+ "<|5.54|>": 50641,
1281
+ "<|5.56|>": 50642,
1282
+ "<|5.58|>": 50643,
1283
+ "<|5.60|>": 50644,
1284
+ "<|5.62|>": 50645,
1285
+ "<|5.64|>": 50646,
1286
+ "<|5.66|>": 50647,
1287
+ "<|5.68|>": 50648,
1288
+ "<|5.70|>": 50649,
1289
+ "<|5.72|>": 50650,
1290
+ "<|5.74|>": 50651,
1291
+ "<|5.76|>": 50652,
1292
+ "<|5.78|>": 50653,
1293
+ "<|5.80|>": 50654,
1294
+ "<|5.82|>": 50655,
1295
+ "<|5.84|>": 50656,
1296
+ "<|5.86|>": 50657,
1297
+ "<|5.88|>": 50658,
1298
+ "<|5.90|>": 50659,
1299
+ "<|5.92|>": 50660,
1300
+ "<|5.94|>": 50661,
1301
+ "<|5.96|>": 50662,
1302
+ "<|5.98|>": 50663,
1303
+ "<|6.00|>": 50664,
1304
+ "<|6.02|>": 50665,
1305
+ "<|6.04|>": 50666,
1306
+ "<|6.06|>": 50667,
1307
+ "<|6.08|>": 50668,
1308
+ "<|6.10|>": 50669,
1309
+ "<|6.12|>": 50670,
1310
+ "<|6.14|>": 50671,
1311
+ "<|6.16|>": 50672,
1312
+ "<|6.18|>": 50673,
1313
+ "<|6.20|>": 50674,
1314
+ "<|6.22|>": 50675,
1315
+ "<|6.24|>": 50676,
1316
+ "<|6.26|>": 50677,
1317
+ "<|6.28|>": 50678,
1318
+ "<|6.30|>": 50679,
1319
+ "<|6.32|>": 50680,
1320
+ "<|6.34|>": 50681,
1321
+ "<|6.36|>": 50682,
1322
+ "<|6.38|>": 50683,
1323
+ "<|6.40|>": 50684,
1324
+ "<|6.42|>": 50685,
1325
+ "<|6.44|>": 50686,
1326
+ "<|6.46|>": 50687,
1327
+ "<|6.48|>": 50688,
1328
+ "<|6.50|>": 50689,
1329
+ "<|6.52|>": 50690,
1330
+ "<|6.54|>": 50691,
1331
+ "<|6.56|>": 50692,
1332
+ "<|6.58|>": 50693,
1333
+ "<|6.60|>": 50694,
1334
+ "<|6.62|>": 50695,
1335
+ "<|6.64|>": 50696,
1336
+ "<|6.66|>": 50697,
1337
+ "<|6.68|>": 50698,
1338
+ "<|6.70|>": 50699,
1339
+ "<|6.72|>": 50700,
1340
+ "<|6.74|>": 50701,
1341
+ "<|6.76|>": 50702,
1342
+ "<|6.78|>": 50703,
1343
+ "<|6.80|>": 50704,
1344
+ "<|6.82|>": 50705,
1345
+ "<|6.84|>": 50706,
1346
+ "<|6.86|>": 50707,
1347
+ "<|6.88|>": 50708,
1348
+ "<|6.90|>": 50709,
1349
+ "<|6.92|>": 50710,
1350
+ "<|6.94|>": 50711,
1351
+ "<|6.96|>": 50712,
1352
+ "<|6.98|>": 50713,
1353
+ "<|7.00|>": 50714,
1354
+ "<|7.02|>": 50715,
1355
+ "<|7.04|>": 50716,
1356
+ "<|7.06|>": 50717,
1357
+ "<|7.08|>": 50718,
1358
+ "<|7.10|>": 50719,
1359
+ "<|7.12|>": 50720,
1360
+ "<|7.14|>": 50721,
1361
+ "<|7.16|>": 50722,
1362
+ "<|7.18|>": 50723,
1363
+ "<|7.20|>": 50724,
1364
+ "<|7.22|>": 50725,
1365
+ "<|7.24|>": 50726,
1366
+ "<|7.26|>": 50727,
1367
+ "<|7.28|>": 50728,
1368
+ "<|7.30|>": 50729,
1369
+ "<|7.32|>": 50730,
1370
+ "<|7.34|>": 50731,
1371
+ "<|7.36|>": 50732,
1372
+ "<|7.38|>": 50733,
1373
+ "<|7.40|>": 50734,
1374
+ "<|7.42|>": 50735,
1375
+ "<|7.44|>": 50736,
1376
+ "<|7.46|>": 50737,
1377
+ "<|7.48|>": 50738,
1378
+ "<|7.50|>": 50739,
1379
+ "<|7.52|>": 50740,
1380
+ "<|7.54|>": 50741,
1381
+ "<|7.56|>": 50742,
1382
+ "<|7.58|>": 50743,
1383
+ "<|7.60|>": 50744,
1384
+ "<|7.62|>": 50745,
1385
+ "<|7.64|>": 50746,
1386
+ "<|7.66|>": 50747,
1387
+ "<|7.68|>": 50748,
1388
+ "<|7.70|>": 50749,
1389
+ "<|7.72|>": 50750,
1390
+ "<|7.74|>": 50751,
1391
+ "<|7.76|>": 50752,
1392
+ "<|7.78|>": 50753,
1393
+ "<|7.80|>": 50754,
1394
+ "<|7.82|>": 50755,
1395
+ "<|7.84|>": 50756,
1396
+ "<|7.86|>": 50757,
1397
+ "<|7.88|>": 50758,
1398
+ "<|7.90|>": 50759,
1399
+ "<|7.92|>": 50760,
1400
+ "<|7.94|>": 50761,
1401
+ "<|7.96|>": 50762,
1402
+ "<|7.98|>": 50763,
1403
+ "<|8.00|>": 50764,
1404
+ "<|8.02|>": 50765,
1405
+ "<|8.04|>": 50766,
1406
+ "<|8.06|>": 50767,
1407
+ "<|8.08|>": 50768,
1408
+ "<|8.10|>": 50769,
1409
+ "<|8.12|>": 50770,
1410
+ "<|8.14|>": 50771,
1411
+ "<|8.16|>": 50772,
1412
+ "<|8.18|>": 50773,
1413
+ "<|8.20|>": 50774,
1414
+ "<|8.22|>": 50775,
1415
+ "<|8.24|>": 50776,
1416
+ "<|8.26|>": 50777,
1417
+ "<|8.28|>": 50778,
1418
+ "<|8.30|>": 50779,
1419
+ "<|8.32|>": 50780,
1420
+ "<|8.34|>": 50781,
1421
+ "<|8.36|>": 50782,
1422
+ "<|8.38|>": 50783,
1423
+ "<|8.40|>": 50784,
1424
+ "<|8.42|>": 50785,
1425
+ "<|8.44|>": 50786,
1426
+ "<|8.46|>": 50787,
1427
+ "<|8.48|>": 50788,
1428
+ "<|8.50|>": 50789,
1429
+ "<|8.52|>": 50790,
1430
+ "<|8.54|>": 50791,
1431
+ "<|8.56|>": 50792,
1432
+ "<|8.58|>": 50793,
1433
+ "<|8.60|>": 50794,
1434
+ "<|8.62|>": 50795,
1435
+ "<|8.64|>": 50796,
1436
+ "<|8.66|>": 50797,
1437
+ "<|8.68|>": 50798,
1438
+ "<|8.70|>": 50799,
1439
+ "<|8.72|>": 50800,
1440
+ "<|8.74|>": 50801,
1441
+ "<|8.76|>": 50802,
1442
+ "<|8.78|>": 50803,
1443
+ "<|8.80|>": 50804,
1444
+ "<|8.82|>": 50805,
1445
+ "<|8.84|>": 50806,
1446
+ "<|8.86|>": 50807,
1447
+ "<|8.88|>": 50808,
1448
+ "<|8.90|>": 50809,
1449
+ "<|8.92|>": 50810,
1450
+ "<|8.94|>": 50811,
1451
+ "<|8.96|>": 50812,
1452
+ "<|8.98|>": 50813,
1453
+ "<|9.00|>": 50814,
1454
+ "<|9.02|>": 50815,
1455
+ "<|9.04|>": 50816,
1456
+ "<|9.06|>": 50817,
1457
+ "<|9.08|>": 50818,
1458
+ "<|9.10|>": 50819,
1459
+ "<|9.12|>": 50820,
1460
+ "<|9.14|>": 50821,
1461
+ "<|9.16|>": 50822,
1462
+ "<|9.18|>": 50823,
1463
+ "<|9.20|>": 50824,
1464
+ "<|9.22|>": 50825,
1465
+ "<|9.24|>": 50826,
1466
+ "<|9.26|>": 50827,
1467
+ "<|9.28|>": 50828,
1468
+ "<|9.30|>": 50829,
1469
+ "<|9.32|>": 50830,
1470
+ "<|9.34|>": 50831,
1471
+ "<|9.36|>": 50832,
1472
+ "<|9.38|>": 50833,
1473
+ "<|9.40|>": 50834,
1474
+ "<|9.42|>": 50835,
1475
+ "<|9.44|>": 50836,
1476
+ "<|9.46|>": 50837,
1477
+ "<|9.48|>": 50838,
1478
+ "<|9.50|>": 50839,
1479
+ "<|9.52|>": 50840,
1480
+ "<|9.54|>": 50841,
1481
+ "<|9.56|>": 50842,
1482
+ "<|9.58|>": 50843,
1483
+ "<|9.60|>": 50844,
1484
+ "<|9.62|>": 50845,
1485
+ "<|9.64|>": 50846,
1486
+ "<|9.66|>": 50847,
1487
+ "<|9.68|>": 50848,
1488
+ "<|9.70|>": 50849,
1489
+ "<|9.72|>": 50850,
1490
+ "<|9.74|>": 50851,
1491
+ "<|9.76|>": 50852,
1492
+ "<|9.78|>": 50853,
1493
+ "<|9.80|>": 50854,
1494
+ "<|9.82|>": 50855,
1495
+ "<|9.84|>": 50856,
1496
+ "<|9.86|>": 50857,
1497
+ "<|9.88|>": 50858,
1498
+ "<|9.90|>": 50859,
1499
+ "<|9.92|>": 50860,
1500
+ "<|9.94|>": 50861,
1501
+ "<|9.96|>": 50862,
1502
+ "<|9.98|>": 50863,
1503
  "<|af|>": 50327,
1504
  "<|am|>": 50334,
1505
  "<|ar|>": 50272,
all_results.json CHANGED
@@ -1,12 +1,8 @@
  {
- "epoch": 83.33,
- "eval_loss": 0.50341796875,
- "eval_runtime": 154.187,
- "eval_samples_per_second": 13.166,
- "eval_steps_per_second": 0.415,
- "eval_wer": 27.279438445464898,
- "train_loss": 0.031859876942634584,
- "train_runtime": 20088.5298,
- "train_samples_per_second": 15.929,
- "train_steps_per_second": 0.249
  }

  {
+ "epoch": 0.82,
+ "train_loss": 0.5370022094726562,
+ "train_runtime": 11752.0797,
+ "train_samples": 194401,
+ "train_samples_per_second": 13.615,
+ "train_steps_per_second": 0.851
  }
config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "whisper-th-small-aug",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "apply_spec_augment": true,
@@ -30,10 +30,10 @@
  "is_encoder_decoder": true,
  "mask_feature_length": 64,
  "mask_feature_min_masks": 0,
- "mask_feature_prob": 0.3,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
- "mask_time_prob": 0.3,
  "max_length": 448,
  "max_source_positions": 1500,
  "max_target_positions": 448,
@@ -43,8 +43,8 @@
  "num_mel_bins": 80,
  "pad_token_id": 50257,
  "scale_embedding": false,
- "torch_dtype": "float32",
- "transformers_version": "4.31.0.dev0",
  "use_cache": true,
  "use_weighted_layer_sum": false,
  "vocab_size": 51865

  {
+ "_name_or_path": "openai/whisper-small",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "apply_spec_augment": true,

  "is_encoder_decoder": true,
  "mask_feature_length": 64,
  "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.1,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
+ "mask_time_prob": 0.1,
  "max_length": 448,
  "max_source_positions": 1500,
  "max_target_positions": 448,

  "num_mel_bins": 80,
  "pad_token_id": 50257,
  "scale_embedding": false,
+ "torch_dtype": "float16",
+ "transformers_version": "4.37.2",
  "use_cache": true,
  "use_weighted_layer_sum": false,
  "vocab_size": 51865
dsconfig.json DELETED
@@ -1,50 +0,0 @@
1
- {
2
- "fp16": {
3
- "enabled": "auto",
4
- "loss_scale": 0,
5
- "loss_scale_window": 1000,
6
- "initial_scale_power": 16,
7
- "hysteresis": 2,
8
- "min_loss_scale": 1
9
- },
10
-
11
- "optimizer": {
12
- "type": "AdamW",
13
- "params": {
14
- "lr": "auto",
15
- "betas": "auto",
16
- "eps": "auto",
17
- "weight_decay": "auto"
18
- }
19
- },
20
-
21
- "scheduler": {
22
- "type": "WarmupDecayLR",
23
- "params": {
24
- "last_batch_iteration": -1,
25
- "total_num_steps": "auto",
26
- "warmup_min_lr": "auto",
27
- "warmup_max_lr": "auto",
28
- "warmup_num_steps": "auto"
29
- }
30
- },
31
-
32
- "zero_optimization": {
33
- "stage": 2,
34
- "offload_optimizer": {
35
- "device": "cpu",
36
- "pin_memory": true
37
- },
38
- "allgather_partitions": true,
39
- "allgather_bucket_size": 2e8,
40
- "overlap_comm": true,
41
- "reduce_scatter": true,
42
- "reduce_bucket_size": 2e8,
43
- "contiguous_gradients": true
44
- },
45
-
46
- "gradient_accumulation_steps": "auto",
47
- "gradient_clipping": "auto",
48
- "train_batch_size": "auto",
49
- "train_micro_batch_size_per_gpu": "auto"
50
- }
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
eval_results.json DELETED
@@ -1,8 +0,0 @@
- {
- "epoch": 83.33,
- "eval_loss": 0.50341796875,
- "eval_runtime": 154.187,
- "eval_samples_per_second": 13.166,
- "eval_steps_per_second": 0.415,
- "eval_wer": 27.279438445464898
- }
generation_config.json CHANGED
@@ -1,4 +1,46 @@
  {
  "begin_suppress_tokens": [
  220,
  50257
@@ -118,10 +160,11 @@
  "<|yo|>": 50325,
  "<|zh|>": 50260
  },
- "max_initial_timestamp_index": 1,
  "max_length": 448,
  "no_timestamps_token_id": 50363,
  "pad_token_id": 50257,
  "return_timestamps": false,
  "suppress_tokens": [
  1,
@@ -217,5 +260,5 @@
  "transcribe": 50359,
  "translate": 50358
  },
- "transformers_version": "4.31.0.dev0"
  }

  {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
  "begin_suppress_tokens": [
  220,
  50257

  "<|yo|>": 50325,
  "<|zh|>": 50260
  },
+ "max_initial_timestamp_index": 50,
  "max_length": 448,
  "no_timestamps_token_id": 50363,
  "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
  "return_timestamps": false,
  "suppress_tokens": [
  1,

  "transcribe": 50359,
  "translate": 50358
  },
+ "transformers_version": "4.37.2"
  }
runs/Feb05_19-39-15_rtx-i7/events.out.tfevents.1675600822.rtx-i7 → model.safetensors RENAMED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:1cf4cac77a972e11e49522adfea296d163095148616dfbb0a0edf48b79729bd2
- size 36381

  version https://git-lfs.github.com/spec/v1
+ oid sha256:d7eb5e67f4593661f6873617291d82ef9a095e156375c927151bf05a998612d7
+ size 563189936
pytorch_model.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:6c48b903002d62eddd6bdac0af1fcb9d1825f7616b03562dbceaa0a029943206
- size 967102926
runs/Feb05_19-39-15_rtx-i7/1675600822.880342/events.out.tfevents.1675600822.rtx-i7 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:293238aa87be01a6b1962e6b3dc488ce4a5a65aefcb3a0d48a86c9927293f5b6
- size 5667
runs/Feb05_19-39-15_rtx-i7/events.out.tfevents.1675621065.rtx-i7 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:717c7fedcce3f5f5a022b22c64e6ed0dd34cba77e2363e0c130ca0f9eae0fd4a
- size 358
special_tokens_map.json CHANGED
@@ -111,22 +111,28 @@
  "bos_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
- "normalized": true,
  "rstrip": false,
  "single_word": false
  },
  "eos_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
- "normalized": true,
  "rstrip": false,
  "single_word": false
  },
- "pad_token": "<|endoftext|>",
  "unk_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
- "normalized": true,
  "rstrip": false,
  "single_word": false
  }

  "bos_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
+ "normalized": false,
  "rstrip": false,
  "single_word": false
  },
  "eos_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
  "rstrip": false,
  "single_word": false
  },

  "unk_token": {
  "content": "<|endoftext|>",
  "lstrip": false,
+ "normalized": false,
  "rstrip": false,
  "single_word": false
  }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json CHANGED
The diff for this file is too large to render. See raw diff
 
train_results.json DELETED
@@ -1,7 +0,0 @@
1
- {
2
- "epoch": 83.33,
3
- "train_loss": 0.031859876942634584,
4
- "train_runtime": 20088.5298,
5
- "train_samples_per_second": 15.929,
6
- "train_steps_per_second": 0.249
7
- }
 
 
 
 
 
 
 
 
trainer_state.json DELETED
@@ -1,1234 +0,0 @@
1
- {
2
- "best_metric": 27.279438445464898,
3
- "best_model_checkpoint": "./checkpoint-5000",
4
- "epoch": 83.33333333333333,
5
- "global_step": 5000,
6
- "is_hyper_param_search": false,
7
- "is_local_process_zero": true,
8
- "is_world_process_zero": true,
9
- "log_history": [
10
- {
11
- "epoch": 0.42,
12
- "learning_rate": 4.973833272194737e-06,
13
- "loss": 0.7626,
14
- "step": 25
15
- },
16
- {
17
- "epoch": 0.83,
18
- "learning_rate": 6.195318418690893e-06,
19
- "loss": 0.6204,
20
- "step": 50
21
- },
22
- {
23
- "epoch": 1.25,
24
- "learning_rate": 6.881634451095711e-06,
25
- "loss": 0.5306,
26
- "step": 75
27
- },
28
- {
29
- "epoch": 1.67,
30
- "learning_rate": 7.361221988663844e-06,
31
- "loss": 0.4583,
32
- "step": 100
33
- },
34
- {
35
- "epoch": 2.08,
36
- "learning_rate": 7.730207550743121e-06,
37
- "loss": 0.4212,
38
- "step": 125
39
- },
40
- {
41
- "epoch": 2.5,
42
- "learning_rate": 8.03016458599496e-06,
43
- "loss": 0.3621,
44
- "step": 150
45
- },
46
- {
47
- "epoch": 2.92,
48
- "learning_rate": 8.282894746203441e-06,
49
- "loss": 0.3433,
50
- "step": 175
51
- },
52
- {
53
- "epoch": 3.33,
54
- "learning_rate": 8.501266121799902e-06,
55
- "loss": 0.2989,
56
- "step": 200
57
- },
58
- {
59
- "epoch": 3.75,
60
- "learning_rate": 8.693512601774437e-06,
61
- "loss": 0.2812,
62
- "step": 225
63
- },
64
- {
65
- "epoch": 4.17,
66
- "learning_rate": 8.865222471593567e-06,
67
- "loss": 0.2559,
68
- "step": 250
69
- },
70
- {
71
- "epoch": 4.58,
72
- "learning_rate": 9.020362953730323e-06,
73
- "loss": 0.2217,
74
- "step": 275
75
- },
76
- {
77
- "epoch": 5.0,
78
- "learning_rate": 9.161852281961698e-06,
79
- "loss": 0.2143,
80
- "step": 300
81
- },
82
- {
83
- "epoch": 5.42,
84
- "learning_rate": 9.29189975311636e-06,
85
- "loss": 0.1723,
86
- "step": 325
87
- },
88
- {
89
- "epoch": 5.83,
90
- "learning_rate": 9.412218256259678e-06,
91
- "loss": 0.1688,
92
- "step": 350
93
- },
94
- {
95
- "epoch": 6.25,
96
- "learning_rate": 9.524162683365145e-06,
97
- "loss": 0.1442,
98
- "step": 375
99
- },
100
- {
101
- "epoch": 6.67,
102
- "learning_rate": 9.62882322733502e-06,
103
- "loss": 0.1264,
104
- "step": 400
105
- },
106
- {
107
- "epoch": 7.08,
108
- "learning_rate": 9.727090137141168e-06,
109
- "loss": 0.1243,
110
- "step": 425
111
- },
112
- {
113
- "epoch": 7.5,
114
- "learning_rate": 9.819699807237934e-06,
115
- "loss": 0.094,
116
- "step": 450
117
- },
118
- {
119
- "epoch": 7.92,
120
- "learning_rate": 9.907268307310855e-06,
121
- "loss": 0.0933,
122
- "step": 475
123
- },
124
- {
125
- "epoch": 8.33,
126
- "learning_rate": 9.990316248055788e-06,
127
- "loss": 0.0702,
128
- "step": 500
129
- },
130
- {
131
- "epoch": 8.75,
132
- "learning_rate": 9.953333333333333e-06,
133
- "loss": 0.0678,
134
- "step": 525
135
- },
136
- {
137
- "epoch": 9.17,
138
- "learning_rate": 9.89777777777778e-06,
139
- "loss": 0.0585,
140
- "step": 550
141
- },
142
- {
143
- "epoch": 9.58,
144
- "learning_rate": 9.842222222222223e-06,
145
- "loss": 0.046,
146
- "step": 575
147
- },
148
- {
149
- "epoch": 10.0,
150
- "learning_rate": 9.786666666666667e-06,
151
- "loss": 0.0464,
152
- "step": 600
153
- },
154
- {
155
- "epoch": 10.42,
156
- "learning_rate": 9.731111111111113e-06,
157
- "loss": 0.0319,
158
- "step": 625
159
- },
160
- {
161
- "epoch": 10.83,
162
- "learning_rate": 9.675555555555555e-06,
163
- "loss": 0.0315,
164
- "step": 650
165
- },
166
- {
167
- "epoch": 11.25,
168
- "learning_rate": 9.620000000000001e-06,
169
- "loss": 0.026,
170
- "step": 675
171
- },
172
- {
173
- "epoch": 11.67,
174
- "learning_rate": 9.564444444444445e-06,
175
- "loss": 0.0215,
176
- "step": 700
177
- },
178
- {
179
- "epoch": 12.08,
180
- "learning_rate": 9.508888888888889e-06,
181
- "loss": 0.0195,
182
- "step": 725
183
- },
184
- {
185
- "epoch": 12.5,
186
- "learning_rate": 9.453333333333335e-06,
187
- "loss": 0.0147,
188
- "step": 750
189
- },
190
- {
191
- "epoch": 12.92,
192
- "learning_rate": 9.397777777777779e-06,
193
- "loss": 0.0145,
194
- "step": 775
195
- },
196
- {
197
- "epoch": 13.33,
198
- "learning_rate": 9.342222222222223e-06,
199
- "loss": 0.0116,
200
- "step": 800
201
- },
202
- {
203
- "epoch": 13.75,
204
- "learning_rate": 9.286666666666667e-06,
205
- "loss": 0.0109,
206
- "step": 825
207
- },
208
- {
209
- "epoch": 14.17,
210
- "learning_rate": 9.231111111111111e-06,
211
- "loss": 0.0093,
212
- "step": 850
213
- },
214
- {
215
- "epoch": 14.58,
216
- "learning_rate": 9.175555555555557e-06,
217
- "loss": 0.0079,
218
- "step": 875
219
- },
220
- {
221
- "epoch": 15.0,
222
- "learning_rate": 9.12e-06,
223
- "loss": 0.0077,
224
- "step": 900
225
- },
226
- {
227
- "epoch": 15.42,
228
- "learning_rate": 9.064444444444447e-06,
229
- "loss": 0.0064,
230
- "step": 925
231
- },
232
- {
233
- "epoch": 15.83,
234
- "learning_rate": 9.008888888888889e-06,
235
- "loss": 0.006,
236
- "step": 950
237
- },
238
- {
239
- "epoch": 16.25,
240
- "learning_rate": 8.953333333333335e-06,
241
- "loss": 0.0051,
242
- "step": 975
243
- },
244
- {
245
- "epoch": 16.67,
246
- "learning_rate": 8.897777777777779e-06,
247
- "loss": 0.0046,
248
- "step": 1000
249
- },
250
- {
251
- "epoch": 17.08,
252
- "learning_rate": 8.842222222222223e-06,
253
- "loss": 0.005,
254
- "step": 1025
255
- },
256
- {
257
- "epoch": 17.5,
258
- "learning_rate": 8.786666666666668e-06,
259
- "loss": 0.0043,
260
- "step": 1050
261
- },
262
- {
263
- "epoch": 17.92,
264
- "learning_rate": 8.73111111111111e-06,
265
- "loss": 0.0039,
266
- "step": 1075
267
- },
268
- {
269
- "epoch": 18.33,
270
- "learning_rate": 8.675555555555556e-06,
271
- "loss": 0.003,
272
- "step": 1100
273
- },
274
- {
275
- "epoch": 18.75,
276
- "learning_rate": 8.62e-06,
277
- "loss": 0.0029,
278
- "step": 1125
279
- },
280
- {
281
- "epoch": 19.17,
282
- "learning_rate": 8.564444444444445e-06,
283
- "loss": 0.0029,
284
- "step": 1150
285
- },
286
- {
287
- "epoch": 19.58,
288
- "learning_rate": 8.50888888888889e-06,
289
- "loss": 0.0025,
290
- "step": 1175
291
- },
292
- {
293
- "epoch": 20.0,
294
- "learning_rate": 8.453333333333334e-06,
295
- "loss": 0.0023,
296
- "step": 1200
297
- },
298
- {
299
- "epoch": 20.42,
300
- "learning_rate": 8.397777777777778e-06,
301
- "loss": 0.002,
302
- "step": 1225
303
- },
304
- {
305
- "epoch": 20.83,
306
- "learning_rate": 8.342222222222222e-06,
307
- "loss": 0.0019,
308
- "step": 1250
309
- },
310
- {
311
- "epoch": 21.25,
312
- "learning_rate": 8.286666666666668e-06,
313
- "loss": 0.0018,
314
- "step": 1275
315
- },
316
- {
317
- "epoch": 21.67,
318
- "learning_rate": 8.231111111111112e-06,
319
- "loss": 0.0017,
320
- "step": 1300
321
- },
322
- {
323
- "epoch": 22.08,
324
- "learning_rate": 8.175555555555556e-06,
325
- "loss": 0.0016,
326
- "step": 1325
327
- },
328
- {
329
- "epoch": 22.5,
330
- "learning_rate": 8.120000000000002e-06,
331
- "loss": 0.0015,
332
- "step": 1350
333
- },
334
- {
335
- "epoch": 22.92,
336
- "learning_rate": 8.064444444444444e-06,
337
- "loss": 0.0015,
338
- "step": 1375
339
- },
340
- {
341
- "epoch": 23.33,
342
- "learning_rate": 8.00888888888889e-06,
343
- "loss": 0.0014,
344
- "step": 1400
345
- },
346
- {
347
- "epoch": 23.75,
348
- "learning_rate": 7.953333333333334e-06,
349
- "loss": 0.0013,
350
- "step": 1425
351
- },
352
- {
353
- "epoch": 24.17,
354
- "learning_rate": 7.897777777777778e-06,
355
- "loss": 0.0013,
356
- "step": 1450
357
- },
358
- {
359
- "epoch": 24.58,
360
- "learning_rate": 7.842222222222224e-06,
361
- "loss": 0.0012,
362
- "step": 1475
363
- },
364
- {
365
- "epoch": 25.0,
366
- "learning_rate": 7.786666666666666e-06,
367
- "loss": 0.0012,
368
- "step": 1500
369
- },
370
- {
371
- "epoch": 25.42,
372
- "learning_rate": 7.731111111111112e-06,
373
- "loss": 0.0011,
374
- "step": 1525
375
- },
376
- {
377
- "epoch": 25.83,
378
- "learning_rate": 7.675555555555556e-06,
379
- "loss": 0.0011,
380
- "step": 1550
381
- },
382
- {
383
- "epoch": 26.25,
384
- "learning_rate": 7.620000000000001e-06,
385
- "loss": 0.001,
386
- "step": 1575
387
- },
388
- {
389
- "epoch": 26.67,
390
- "learning_rate": 7.564444444444446e-06,
391
- "loss": 0.001,
392
- "step": 1600
393
- },
394
- {
395
- "epoch": 27.08,
396
- "learning_rate": 7.50888888888889e-06,
397
- "loss": 0.0014,
398
- "step": 1625
399
- },
400
- {
401
- "epoch": 27.5,
402
- "learning_rate": 7.453333333333334e-06,
403
- "loss": 0.0024,
404
- "step": 1650
405
- },
406
- {
407
- "epoch": 27.92,
408
- "learning_rate": 7.3977777777777786e-06,
409
- "loss": 0.0033,
410
- "step": 1675
411
- },
412
- {
413
- "epoch": 28.33,
414
- "learning_rate": 7.342222222222223e-06,
415
- "loss": 0.0051,
416
- "step": 1700
417
- },
418
- {
419
- "epoch": 28.75,
420
- "learning_rate": 7.2866666666666675e-06,
421
- "loss": 0.0061,
422
- "step": 1725
423
- },
424
- {
425
- "epoch": 29.17,
426
- "learning_rate": 7.231111111111112e-06,
427
- "loss": 0.0063,
428
- "step": 1750
429
- },
430
- {
431
- "epoch": 29.58,
432
- "learning_rate": 7.1755555555555556e-06,
433
- "loss": 0.0058,
434
- "step": 1775
435
- },
436
- {
437
- "epoch": 30.0,
438
- "learning_rate": 7.1200000000000004e-06,
439
- "loss": 0.0055,
440
- "step": 1800
441
- },
442
- {
443
- "epoch": 30.42,
444
- "learning_rate": 7.0644444444444445e-06,
445
- "loss": 0.0043,
446
- "step": 1825
447
- },
448
- {
449
- "epoch": 30.83,
450
- "learning_rate": 7.008888888888889e-06,
451
- "loss": 0.0047,
452
- "step": 1850
453
- },
454
- {
455
- "epoch": 31.25,
456
- "learning_rate": 6.953333333333334e-06,
457
- "loss": 0.0047,
458
- "step": 1875
459
- },
460
- {
461
- "epoch": 31.67,
462
- "learning_rate": 6.897777777777779e-06,
463
- "loss": 0.0044,
464
- "step": 1900
465
- },
466
- {
467
- "epoch": 32.08,
468
- "learning_rate": 6.842222222222222e-06,
469
- "loss": 0.0034,
470
- "step": 1925
471
- },
472
- {
473
- "epoch": 32.5,
474
- "learning_rate": 6.786666666666667e-06,
475
- "loss": 0.0034,
476
- "step": 1950
477
- },
478
- {
479
- "epoch": 32.92,
480
- "learning_rate": 6.731111111111111e-06,
481
- "loss": 0.0029,
482
- "step": 1975
483
- },
484
- {
485
- "epoch": 33.33,
486
- "learning_rate": 6.675555555555556e-06,
487
- "loss": 0.0026,
488
- "step": 2000
489
- },
490
- {
491
- "epoch": 33.75,
492
- "learning_rate": 6.6244444444444445e-06,
493
- "loss": 0.0019,
494
- "step": 2025
495
- },
496
- {
497
- "epoch": 34.17,
498
- "learning_rate": 6.568888888888889e-06,
499
- "loss": 0.0015,
500
- "step": 2050
501
- },
502
- {
503
- "epoch": 34.58,
504
- "learning_rate": 6.513333333333333e-06,
505
- "loss": 0.0012,
506
- "step": 2075
507
- },
508
- {
509
- "epoch": 35.0,
510
- "learning_rate": 6.457777777777778e-06,
511
- "loss": 0.0011,
512
- "step": 2100
513
- },
514
- {
515
- "epoch": 35.42,
516
- "learning_rate": 6.402222222222223e-06,
517
- "loss": 0.0009,
518
- "step": 2125
519
- },
520
- {
521
- "epoch": 35.83,
522
- "learning_rate": 6.346666666666668e-06,
523
- "loss": 0.001,
524
- "step": 2150
525
- },
526
- {
527
- "epoch": 36.25,
528
- "learning_rate": 6.291111111111111e-06,
529
- "loss": 0.0008,
530
- "step": 2175
531
- },
532
- {
533
- "epoch": 36.67,
534
- "learning_rate": 6.235555555555556e-06,
535
- "loss": 0.0007,
536
- "step": 2200
537
- },
538
- {
539
- "epoch": 37.08,
540
- "learning_rate": 6.18e-06,
541
- "loss": 0.0007,
542
- "step": 2225
543
- },
544
- {
545
- "epoch": 37.5,
546
- "learning_rate": 6.124444444444445e-06,
547
- "loss": 0.0007,
548
- "step": 2250
549
- },
550
- {
551
- "epoch": 37.92,
552
- "learning_rate": 6.06888888888889e-06,
553
- "loss": 0.0008,
554
- "step": 2275
555
- },
556
- {
557
- "epoch": 38.33,
558
- "learning_rate": 6.013333333333335e-06,
559
- "loss": 0.0008,
560
- "step": 2300
561
- },
562
- {
563
- "epoch": 38.75,
564
- "learning_rate": 5.957777777777778e-06,
565
- "loss": 0.0008,
566
- "step": 2325
567
- },
568
- {
569
- "epoch": 39.17,
570
- "learning_rate": 5.902222222222223e-06,
571
- "loss": 0.0007,
572
- "step": 2350
573
- },
574
- {
575
- "epoch": 39.58,
576
- "learning_rate": 5.846666666666667e-06,
577
- "loss": 0.0008,
578
- "step": 2375
579
- },
580
- {
581
- "epoch": 40.0,
582
- "learning_rate": 5.791111111111112e-06,
583
- "loss": 0.0007,
584
- "step": 2400
585
- },
586
- {
587
- "epoch": 40.42,
588
- "learning_rate": 5.735555555555557e-06,
589
- "loss": 0.0006,
590
- "step": 2425
591
- },
592
- {
593
- "epoch": 40.83,
594
- "learning_rate": 5.68e-06,
595
- "loss": 0.0006,
596
- "step": 2450
597
- },
598
- {
599
- "epoch": 41.25,
600
- "learning_rate": 5.624444444444445e-06,
601
- "loss": 0.0005,
602
- "step": 2475
603
- },
604
- {
605
- "epoch": 41.67,
606
- "learning_rate": 5.56888888888889e-06,
607
- "loss": 0.0005,
608
- "step": 2500
609
- },
610
- {
611
- "epoch": 42.08,
612
- "learning_rate": 5.513333333333334e-06,
613
- "loss": 0.0005,
614
- "step": 2525
615
- },
616
- {
617
- "epoch": 42.5,
618
- "learning_rate": 5.4577777777777785e-06,
619
- "loss": 0.0005,
620
- "step": 2550
621
- },
622
- {
623
- "epoch": 42.92,
624
- "learning_rate": 5.402222222222223e-06,
625
- "loss": 0.0005,
626
- "step": 2575
627
- },
628
- {
629
- "epoch": 43.33,
630
- "learning_rate": 5.346666666666667e-06,
631
- "loss": 0.0005,
632
- "step": 2600
633
- },
634
- {
635
- "epoch": 43.75,
636
- "learning_rate": 5.2911111111111115e-06,
637
- "loss": 0.0005,
638
- "step": 2625
639
- },
640
- {
641
- "epoch": 44.17,
642
- "learning_rate": 5.235555555555556e-06,
643
- "loss": 0.0004,
644
- "step": 2650
645
- },
646
- {
647
- "epoch": 44.58,
648
- "learning_rate": 5.18e-06,
649
- "loss": 0.0005,
650
- "step": 2675
651
- },
652
- {
653
- "epoch": 45.0,
654
- "learning_rate": 5.124444444444445e-06,
655
- "loss": 0.0004,
656
- "step": 2700
657
- },
658
- {
659
- "epoch": 45.42,
660
- "learning_rate": 5.06888888888889e-06,
661
- "loss": 0.0004,
662
- "step": 2725
663
- },
664
- {
665
- "epoch": 45.83,
666
- "learning_rate": 5.013333333333333e-06,
667
- "loss": 0.0004,
668
- "step": 2750
669
- },
670
- {
671
- "epoch": 46.25,
672
- "learning_rate": 4.957777777777778e-06,
673
- "loss": 0.0004,
674
- "step": 2775
675
- },
676
- {
677
- "epoch": 46.67,
678
- "learning_rate": 4.902222222222222e-06,
679
- "loss": 0.0004,
680
- "step": 2800
681
- },
682
- {
683
- "epoch": 47.08,
684
- "learning_rate": 4.846666666666667e-06,
685
- "loss": 0.0004,
686
- "step": 2825
687
- },
688
- {
689
- "epoch": 47.5,
690
- "learning_rate": 4.791111111111111e-06,
691
- "loss": 0.0004,
692
- "step": 2850
693
- },
694
- {
695
- "epoch": 47.92,
696
- "learning_rate": 4.735555555555556e-06,
697
- "loss": 0.0004,
698
- "step": 2875
699
- },
700
- {
701
- "epoch": 48.33,
702
- "learning_rate": 4.680000000000001e-06,
703
- "loss": 0.0004,
704
- "step": 2900
705
- },
706
- {
707
- "epoch": 48.75,
708
- "learning_rate": 4.624444444444445e-06,
709
- "loss": 0.0004,
710
- "step": 2925
711
- },
712
- {
713
- "epoch": 49.17,
714
- "learning_rate": 4.568888888888889e-06,
715
- "loss": 0.0003,
716
- "step": 2950
717
- },
718
- {
719
- "epoch": 49.58,
720
- "learning_rate": 4.513333333333333e-06,
721
- "loss": 0.0003,
722
- "step": 2975
723
- },
724
- {
725
- "epoch": 50.0,
726
- "learning_rate": 4.457777777777778e-06,
727
- "loss": 0.0003,
728
- "step": 3000
729
- },
730
- {
731
- "epoch": 50.42,
732
- "learning_rate": 4.406666666666667e-06,
733
- "loss": 0.0003,
734
- "step": 3025
735
- },
736
- {
737
- "epoch": 50.83,
738
- "learning_rate": 4.351111111111111e-06,
739
- "loss": 0.0003,
740
- "step": 3050
741
- },
742
- {
743
- "epoch": 51.25,
744
- "learning_rate": 4.295555555555556e-06,
745
- "loss": 0.0003,
746
- "step": 3075
747
- },
748
- {
749
- "epoch": 51.67,
750
- "learning_rate": 4.24e-06,
751
- "loss": 0.0003,
752
- "step": 3100
753
- },
754
- {
755
- "epoch": 52.08,
756
- "learning_rate": 4.184444444444445e-06,
757
- "loss": 0.0003,
758
- "step": 3125
759
- },
760
- {
761
- "epoch": 52.5,
762
- "learning_rate": 4.12888888888889e-06,
763
- "loss": 0.0003,
764
- "step": 3150
765
- },
766
- {
767
- "epoch": 52.92,
768
- "learning_rate": 4.073333333333334e-06,
769
- "loss": 0.0003,
770
- "step": 3175
771
- },
772
- {
773
- "epoch": 53.33,
774
- "learning_rate": 4.017777777777778e-06,
775
- "loss": 0.0003,
776
- "step": 3200
777
- },
778
- {
779
- "epoch": 53.75,
780
- "learning_rate": 3.962222222222222e-06,
781
- "loss": 0.0003,
782
- "step": 3225
783
- },
784
- {
785
- "epoch": 54.17,
786
- "learning_rate": 3.906666666666667e-06,
787
- "loss": 0.0003,
788
- "step": 3250
789
- },
790
- {
791
- "epoch": 54.58,
792
- "learning_rate": 3.851111111111112e-06,
793
- "loss": 0.0003,
794
- "step": 3275
795
- },
796
- {
797
- "epoch": 55.0,
798
- "learning_rate": 3.7955555555555557e-06,
799
- "loss": 0.0003,
800
- "step": 3300
801
- },
802
- {
803
- "epoch": 55.42,
804
- "learning_rate": 3.74e-06,
805
- "loss": 0.0003,
806
- "step": 3325
807
- },
808
- {
809
- "epoch": 55.83,
810
- "learning_rate": 3.684444444444445e-06,
811
- "loss": 0.0003,
812
- "step": 3350
813
- },
814
- {
815
- "epoch": 56.25,
816
- "learning_rate": 3.628888888888889e-06,
817
- "loss": 0.0003,
818
- "step": 3375
819
- },
820
- {
821
- "epoch": 56.67,
822
- "learning_rate": 3.5733333333333336e-06,
823
- "loss": 0.0003,
824
- "step": 3400
825
- },
826
- {
827
- "epoch": 57.08,
828
- "learning_rate": 3.5177777777777784e-06,
829
- "loss": 0.0003,
830
- "step": 3425
831
- },
832
- {
833
- "epoch": 57.5,
834
- "learning_rate": 3.4622222222222225e-06,
835
- "loss": 0.0003,
836
- "step": 3450
837
- },
838
- {
839
- "epoch": 57.92,
840
- "learning_rate": 3.406666666666667e-06,
841
- "loss": 0.0003,
842
- "step": 3475
843
- },
844
- {
845
- "epoch": 58.33,
846
- "learning_rate": 3.351111111111112e-06,
847
- "loss": 0.0003,
848
- "step": 3500
849
- },
850
- {
851
- "epoch": 58.75,
852
- "learning_rate": 3.295555555555556e-06,
853
- "loss": 0.0003,
854
- "step": 3525
855
- },
856
- {
857
- "epoch": 59.17,
858
- "learning_rate": 3.2400000000000003e-06,
859
- "loss": 0.0003,
860
- "step": 3550
861
- },
862
- {
863
- "epoch": 59.58,
864
- "learning_rate": 3.1844444444444444e-06,
865
- "loss": 0.0003,
866
- "step": 3575
867
- },
868
- {
869
- "epoch": 60.0,
870
- "learning_rate": 3.1288888888888892e-06,
871
- "loss": 0.0003,
872
- "step": 3600
873
- },
874
- {
875
- "epoch": 60.42,
876
- "learning_rate": 3.0733333333333337e-06,
877
- "loss": 0.0002,
878
- "step": 3625
879
- },
880
- {
881
- "epoch": 60.83,
882
- "learning_rate": 3.0177777777777777e-06,
883
- "loss": 0.0003,
884
- "step": 3650
885
- },
886
- {
887
- "epoch": 61.25,
888
- "learning_rate": 2.9622222222222226e-06,
889
- "loss": 0.0002,
890
- "step": 3675
891
- },
892
- {
893
- "epoch": 61.67,
894
- "learning_rate": 2.906666666666667e-06,
895
- "loss": 0.0002,
896
- "step": 3700
897
- },
898
- {
899
- "epoch": 62.08,
900
- "learning_rate": 2.851111111111111e-06,
901
- "loss": 0.0002,
902
- "step": 3725
903
- },
904
- {
905
- "epoch": 62.5,
906
- "learning_rate": 2.795555555555556e-06,
907
- "loss": 0.0002,
908
- "step": 3750
909
- },
910
- {
911
- "epoch": 62.92,
912
- "learning_rate": 2.7400000000000004e-06,
913
- "loss": 0.0002,
914
- "step": 3775
915
- },
916
- {
917
- "epoch": 63.33,
918
- "learning_rate": 2.6844444444444445e-06,
919
- "loss": 0.0002,
920
- "step": 3800
921
- },
922
- {
923
- "epoch": 63.75,
924
- "learning_rate": 2.6288888888888894e-06,
925
- "loss": 0.0002,
926
- "step": 3825
927
- },
928
- {
929
- "epoch": 64.17,
930
- "learning_rate": 2.573333333333334e-06,
931
- "loss": 0.0002,
932
- "step": 3850
933
- },
934
- {
935
- "epoch": 64.58,
936
- "learning_rate": 2.517777777777778e-06,
937
- "loss": 0.0002,
938
- "step": 3875
939
- },
940
- {
941
- "epoch": 65.0,
942
- "learning_rate": 2.4622222222222223e-06,
943
- "loss": 0.0002,
944
- "step": 3900
945
- },
946
- {
947
- "epoch": 65.42,
948
- "learning_rate": 2.4066666666666668e-06,
949
- "loss": 0.0002,
950
- "step": 3925
951
- },
952
- {
953
- "epoch": 65.83,
954
- "learning_rate": 2.3511111111111112e-06,
955
- "loss": 0.0002,
956
- "step": 3950
957
- },
958
- {
959
- "epoch": 66.25,
960
- "learning_rate": 2.2955555555555557e-06,
961
- "loss": 0.0002,
962
- "step": 3975
963
- },
964
- {
965
- "epoch": 66.67,
966
- "learning_rate": 2.24e-06,
967
- "loss": 0.0002,
968
- "step": 4000
969
- },
970
- {
971
- "epoch": 67.08,
972
- "learning_rate": 2.188888888888889e-06,
973
- "loss": 0.0002,
974
- "step": 4025
975
- },
976
- {
977
- "epoch": 67.5,
978
- "learning_rate": 2.133333333333334e-06,
979
- "loss": 0.0002,
980
- "step": 4050
981
- },
982
- {
983
- "epoch": 67.92,
984
- "learning_rate": 2.077777777777778e-06,
985
- "loss": 0.0002,
986
- "step": 4075
987
- },
988
- {
989
- "epoch": 68.33,
990
- "learning_rate": 2.0222222222222223e-06,
991
- "loss": 0.0002,
992
- "step": 4100
993
- },
994
- {
995
- "epoch": 68.75,
996
- "learning_rate": 1.9666666666666668e-06,
997
- "loss": 0.0002,
998
- "step": 4125
999
- },
1000
- {
1001
- "epoch": 69.17,
1002
- "learning_rate": 1.9111111111111112e-06,
1003
- "loss": 0.0002,
1004
- "step": 4150
1005
- },
1006
- {
1007
- "epoch": 69.58,
1008
- "learning_rate": 1.8555555555555557e-06,
1009
- "loss": 0.0002,
1010
- "step": 4175
1011
- },
1012
- {
1013
- "epoch": 70.0,
1014
- "learning_rate": 1.8000000000000001e-06,
1015
- "loss": 0.0002,
1016
- "step": 4200
1017
- },
1018
- {
1019
- "epoch": 70.42,
1020
- "learning_rate": 1.7444444444444448e-06,
1021
- "loss": 0.0002,
1022
- "step": 4225
1023
- },
1024
- {
1025
- "epoch": 70.83,
1026
- "learning_rate": 1.688888888888889e-06,
1027
- "loss": 0.0002,
1028
- "step": 4250
1029
- },
1030
- {
1031
- "epoch": 71.25,
1032
- "learning_rate": 1.6333333333333335e-06,
1033
- "loss": 0.0002,
1034
- "step": 4275
1035
- },
1036
- {
1037
- "epoch": 71.67,
1038
- "learning_rate": 1.5777777777777778e-06,
1039
- "loss": 0.0002,
1040
- "step": 4300
1041
- },
1042
- {
1043
- "epoch": 72.08,
1044
- "learning_rate": 1.5222222222222224e-06,
1045
- "loss": 0.0002,
1046
- "step": 4325
1047
- },
1048
- {
1049
- "epoch": 72.5,
1050
- "learning_rate": 1.4666666666666669e-06,
1051
- "loss": 0.0002,
1052
- "step": 4350
1053
- },
1054
- {
1055
- "epoch": 72.92,
1056
- "learning_rate": 1.4111111111111111e-06,
1057
- "loss": 0.0002,
1058
- "step": 4375
1059
- },
1060
- {
1061
- "epoch": 73.33,
1062
- "learning_rate": 1.3555555555555558e-06,
1063
- "loss": 0.0002,
1064
- "step": 4400
1065
- },
1066
- {
1067
- "epoch": 73.75,
1068
- "learning_rate": 1.3e-06,
1069
- "loss": 0.0002,
1070
- "step": 4425
1071
- },
1072
- {
1073
- "epoch": 74.17,
1074
- "learning_rate": 1.2444444444444445e-06,
1075
- "loss": 0.0002,
1076
- "step": 4450
1077
- },
1078
- {
1079
- "epoch": 74.58,
1080
- "learning_rate": 1.188888888888889e-06,
1081
- "loss": 0.0002,
1082
- "step": 4475
1083
- },
1084
- {
1085
- "epoch": 75.0,
1086
- "learning_rate": 1.1333333333333334e-06,
1087
- "loss": 0.0002,
1088
- "step": 4500
1089
- },
1090
- {
1091
- "epoch": 75.42,
1092
- "learning_rate": 1.0777777777777779e-06,
1093
- "loss": 0.0002,
1094
- "step": 4525
1095
- },
1096
- {
1097
- "epoch": 75.83,
1098
- "learning_rate": 1.0222222222222223e-06,
1099
- "loss": 0.0002,
1100
- "step": 4550
1101
- },
1102
- {
1103
- "epoch": 76.25,
1104
- "learning_rate": 9.666666666666668e-07,
1105
- "loss": 0.0002,
1106
- "step": 4575
1107
- },
1108
- {
1109
- "epoch": 76.67,
1110
- "learning_rate": 9.111111111111113e-07,
1111
- "loss": 0.0002,
1112
- "step": 4600
1113
- },
1114
- {
1115
- "epoch": 77.08,
1116
- "learning_rate": 8.555555555555556e-07,
1117
- "loss": 0.0002,
1118
- "step": 4625
1119
- },
1120
- {
1121
- "epoch": 77.5,
1122
- "learning_rate": 8.000000000000001e-07,
1123
- "loss": 0.0002,
1124
- "step": 4650
1125
- },
1126
- {
1127
- "epoch": 77.92,
1128
- "learning_rate": 7.444444444444444e-07,
1129
- "loss": 0.0002,
1130
- "step": 4675
1131
- },
1132
- {
1133
- "epoch": 78.33,
1134
- "learning_rate": 6.88888888888889e-07,
1135
- "loss": 0.0002,
1136
- "step": 4700
1137
- },
1138
- {
1139
- "epoch": 78.75,
1140
- "learning_rate": 6.333333333333334e-07,
1141
- "loss": 0.0002,
1142
- "step": 4725
1143
- },
1144
- {
1145
- "epoch": 79.17,
1146
- "learning_rate": 5.777777777777778e-07,
1147
- "loss": 0.0002,
1148
- "step": 4750
1149
- },
1150
- {
1151
- "epoch": 79.58,
1152
- "learning_rate": 5.222222222222223e-07,
1153
- "loss": 0.0002,
1154
- "step": 4775
1155
- },
1156
- {
1157
- "epoch": 80.0,
1158
- "learning_rate": 4.666666666666667e-07,
1159
- "loss": 0.0002,
1160
- "step": 4800
1161
- },
1162
- {
1163
- "epoch": 80.42,
1164
- "learning_rate": 4.111111111111112e-07,
1165
- "loss": 0.0002,
1166
- "step": 4825
1167
- },
1168
- {
1169
- "epoch": 80.83,
1170
- "learning_rate": 3.555555555555556e-07,
1171
- "loss": 0.0002,
1172
- "step": 4850
1173
- },
1174
- {
1175
- "epoch": 81.25,
1176
- "learning_rate": 3.0000000000000004e-07,
1177
- "loss": 0.0002,
1178
- "step": 4875
1179
- },
1180
- {
1181
- "epoch": 81.67,
1182
- "learning_rate": 2.444444444444445e-07,
1183
- "loss": 0.0002,
1184
- "step": 4900
1185
- },
1186
- {
1187
- "epoch": 82.08,
1188
- "learning_rate": 1.888888888888889e-07,
1189
- "loss": 0.0002,
1190
- "step": 4925
1191
- },
1192
- {
1193
- "epoch": 82.5,
1194
- "learning_rate": 1.3333333333333336e-07,
1195
- "loss": 0.0002,
1196
- "step": 4950
1197
- },
1198
- {
1199
- "epoch": 82.92,
1200
- "learning_rate": 7.777777777777778e-08,
1201
- "loss": 0.0002,
1202
- "step": 4975
1203
- },
1204
- {
1205
- "epoch": 83.33,
1206
- "learning_rate": 2.2222222222222224e-08,
1207
- "loss": 0.0002,
1208
- "step": 5000
1209
- },
1210
- {
- "epoch": 83.33,
- "eval_loss": 0.50341796875,
- "eval_runtime": 154.2054,
- "eval_samples_per_second": 13.164,
- "eval_steps_per_second": 0.415,
- "eval_wer": 27.279438445464898,
- "step": 5000
- },
- {
- "epoch": 83.33,
- "step": 5000,
- "total_flos": 9.17485126904384e+19,
- "train_loss": 0.031859876942634584,
- "train_runtime": 20088.5298,
- "train_samples_per_second": 15.929,
- "train_steps_per_second": 0.249
- }
- ],
- "max_steps": 5000,
- "num_train_epochs": 84,
- "total_flos": 9.17485126904384e+19,
- "trial_name": null,
- "trial_params": null
- }
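The deleted file above follows the standard 🤗 `Trainer` state format, where evaluation results live in a `log_history` list. As a minimal sketch (the field names `log_history`, `eval_wer`, `eval_loss`, and `step` are taken from the diff above; the inline JSON is a shortened stand-in for the real file), the final evaluation metrics can be extracted like this:

```python
import json

# Shortened stand-in for a trainer_state.json file (real files hold one
# entry per logging step; only the final eval entry is reproduced here).
raw = """
{
  "log_history": [
    {"epoch": 83.33, "eval_loss": 0.50341796875,
     "eval_wer": 27.279438445464898, "step": 5000}
  ],
  "max_steps": 5000
}
"""

state = json.loads(raw)

# Evaluation entries are the ones carrying eval_* keys.
evals = [e for e in state["log_history"] if "eval_wer" in e]
last = evals[-1]
print(f"step {last['step']}: WER {last['eval_wer']:.2f}, loss {last['eval_loss']:.4f}")
```

Against the full deleted file this would report the step-5000 evaluation (WER ≈ 27.28 on the eval split), which is distinct from the WER 13.14 quoted in the model card for the common-voice-13 test set with the Deepcut tokenizer.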
training_args.bin DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:92e0d28a477d6fbed86250bc2bdb7420c074fb1d27d9be486675bd887280e570
- size 4352
vocab.json CHANGED
The diff for this file is too large to render. See raw diff