Commit 86e3b91 (verified) by dbourget · 1 parent: cbb8e81

Add new SentenceTransformer model.
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 1024,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
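This pooling configuration selects CLS-token pooling: the sentence embedding is the hidden state of the first (`[CLS]`) token rather than a mean over all token embeddings. As a rough illustration (not part of the commit), an equivalent module could be built with `sentence_transformers.models.Pooling`, whose keyword arguments mirror the keys above:

```python
from sentence_transformers import models

# Sketch: rebuild a Pooling module equivalent to 1_Pooling/config.json.
# With CLS pooling, only the first token's hidden state becomes the sentence embedding.
pooling = models.Pooling(
    word_embedding_dimension=1024,
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
    pooling_mode_max_tokens=False,
)
print(pooling.get_pooling_mode_str())  # expected: "cls"
```
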
README.md ADDED
@@ -0,0 +1,725 @@
---
base_model: dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1
library_name: sentence-transformers
metrics:
- cosine_accuracy
- dot_accuracy
- manhattan_accuracy
- euclidean_accuracy
- max_accuracy
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:9504
- loss:TripletLoss
widget:
- source_sentence: cap product
  sentences:
  - method of adjoining a chain of degree p with a co-chain of degree q, where q is less than or equal to p, to form a composite chain of degree p-q
  - 'Ontology '
  - hat commodity
- source_sentence: cognitivism
  sentences:
  - supporting cognitive science
  - study of changes in organisms caused by modification of gene expression rather than alteration of the genetic code
  - 'the idea that mind works like an algorithmic symbol manipulation '
- source_sentence: doxastic voluntarism
  sentences:
  - Land surrounded by water
  - belief one is free
  - the ability to will beliefs
- source_sentence: conceptual role
  sentences:
  - concept
  - inferential role
  - 'Theory of knowledge '
- source_sentence: scientific revolutions
  sentences:
  - scientific realism
  - Universal moral principles govern legal systems
  - paradigm shifts
model-index:
- name: SentenceTransformer based on dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: beatai dev
      type: beatai-dev
    metrics:
    - type: cosine_accuracy
      value: 0.7929292929292929
      name: Cosine Accuracy
    - type: dot_accuracy
      value: 0.2542087542087542
      name: Dot Accuracy
    - type: manhattan_accuracy
      value: 0.8021885521885522
      name: Manhattan Accuracy
    - type: euclidean_accuracy
      value: 0.8013468013468014
      name: Euclidean Accuracy
    - type: max_accuracy
      value: 0.8021885521885522
      name: Max Accuracy
---

# SentenceTransformer based on dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1](https://huggingface.co/dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1](https://huggingface.co/dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1) <!-- at revision e3be09e156ca8e2b7b4e5d296fc50a316393eda3 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

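The two modules above correspond to the entries in `modules.json` shipped with this commit. If you wanted to assemble the same stack by hand rather than loading the repo directly, a hedged sketch (repo id taken from the usage section below; this is illustrative, not required):

```python
from sentence_transformers import SentenceTransformer, models

# Sketch: recreate the Transformer + Pooling stack explicitly.
repo = "dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e"
word_embedding_model = models.Transformer(repo, max_seq_length=512)
cls_pooling = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
)
model = SentenceTransformer(modules=[word_embedding_model, cls_pooling])
print(model)  # should mirror the architecture printed above
```
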
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e")
# Run inference
sentences = [
    'scientific revolutions',
    'paradigm shifts',
    'scientific realism',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

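The same embeddings also support simple semantic search over a small corpus. A hedged sketch using `sentence_transformers.util.semantic_search`; the corpus strings below are illustrative only:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e")

# Illustrative corpus; replace with your own documents.
corpus = ["paradigm shifts", "scientific realism", "doxastic voluntarism"]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode("scientific revolutions", convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))
```
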
<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet
* Dataset: `beatai-dev`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.7929** |
| dot_accuracy        | 0.2542     |
| manhattan_accuracy  | 0.8022     |
| euclidean_accuracy  | 0.8013     |
| max_accuracy        | 0.8022     |

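These scores come from `TripletEvaluator`, which checks how often an anchor sits closer to its positive than to its negative under each distance function. A minimal sketch of running the same kind of evaluation yourself; the single triple below is taken from the widget examples, and the `beatai-dev` data itself is not part of this commit:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e")

# One illustrative (anchor, positive, negative) triple.
evaluator = TripletEvaluator(
    anchors=["scientific revolutions"],
    positives=["paradigm shifts"],
    negatives=["scientific realism"],
    name="beatai-dev",
)
print(evaluator(model))  # dict of accuracies, e.g. {"beatai-dev_cosine_accuracy": ...}
```
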
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 138
- `per_device_eval_batch_size`: 138
- `learning_rate`: 5e-07
- `weight_decay`: 0.01
- `num_train_epochs`: 50
- `lr_scheduler_type`: constant
- `bf16`: True
- `dataloader_drop_last`: True
- `resume_from_checkpoint`: True

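For orientation, a sketch of a training run that matches these values and the `TripletLoss` named in the model metadata, assuming a triplet dataset with `anchor`/`positive`/`negative` columns; the data file name is a placeholder (the actual training data is not included in this commit), and evaluation arguments are omitted for brevity:

```python
from datasets import load_dataset
from sentence_transformers import (SentenceTransformer, SentenceTransformerTrainer,
                                   SentenceTransformerTrainingArguments)
from sentence_transformers.losses import TripletLoss

model = SentenceTransformer("dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1")

# Placeholder path; assumes columns "anchor", "positive", "negative" (9,504 rows).
train_dataset = load_dataset("json", data_files="beatai_triplets.jsonl", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="pb-small-beatai-cosine-50e",
    num_train_epochs=50,
    per_device_train_batch_size=138,
    per_device_eval_batch_size=138,
    learning_rate=5e-7,
    weight_decay=0.01,
    lr_scheduler_type="constant",
    bf16=True,
    dataloader_drop_last=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=TripletLoss(model),
)
trainer.train()
```
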
#### All Hyperparameters
<details><summary>Click to expand</summary>

207
+ - `overwrite_output_dir`: False
208
+ - `do_predict`: False
209
+ - `eval_strategy`: steps
210
+ - `prediction_loss_only`: True
211
+ - `per_device_train_batch_size`: 138
212
+ - `per_device_eval_batch_size`: 138
213
+ - `per_gpu_train_batch_size`: None
214
+ - `per_gpu_eval_batch_size`: None
215
+ - `gradient_accumulation_steps`: 1
216
+ - `eval_accumulation_steps`: None
217
+ - `torch_empty_cache_steps`: None
218
+ - `learning_rate`: 5e-07
219
+ - `weight_decay`: 0.01
220
+ - `adam_beta1`: 0.9
221
+ - `adam_beta2`: 0.999
222
+ - `adam_epsilon`: 1e-08
223
+ - `max_grad_norm`: 1.0
224
+ - `num_train_epochs`: 50
225
+ - `max_steps`: -1
226
+ - `lr_scheduler_type`: constant
227
+ - `lr_scheduler_kwargs`: {}
228
+ - `warmup_ratio`: 0
229
+ - `warmup_steps`: 0
230
+ - `log_level`: passive
231
+ - `log_level_replica`: warning
232
+ - `log_on_each_node`: True
233
+ - `logging_nan_inf_filter`: True
234
+ - `save_safetensors`: True
235
+ - `save_on_each_node`: False
236
+ - `save_only_model`: False
237
+ - `restore_callback_states_from_checkpoint`: False
238
+ - `no_cuda`: False
239
+ - `use_cpu`: False
240
+ - `use_mps_device`: False
241
+ - `seed`: 42
242
+ - `data_seed`: None
243
+ - `jit_mode_eval`: False
244
+ - `use_ipex`: False
245
+ - `bf16`: True
246
+ - `fp16`: False
247
+ - `fp16_opt_level`: O1
248
+ - `half_precision_backend`: auto
249
+ - `bf16_full_eval`: False
250
+ - `fp16_full_eval`: False
251
+ - `tf32`: None
252
+ - `local_rank`: 0
253
+ - `ddp_backend`: None
254
+ - `tpu_num_cores`: None
255
+ - `tpu_metrics_debug`: False
256
+ - `debug`: []
257
+ - `dataloader_drop_last`: True
258
+ - `dataloader_num_workers`: 0
259
+ - `dataloader_prefetch_factor`: 2
260
+ - `past_index`: -1
261
+ - `disable_tqdm`: False
262
+ - `remove_unused_columns`: True
263
+ - `label_names`: None
264
+ - `load_best_model_at_end`: False
265
+ - `ignore_data_skip`: False
266
+ - `fsdp`: []
267
+ - `fsdp_min_num_params`: 0
268
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
269
+ - `fsdp_transformer_layer_cls_to_wrap`: None
270
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
271
+ - `deepspeed`: None
272
+ - `label_smoothing_factor`: 0.0
273
+ - `optim`: adamw_torch
274
+ - `optim_args`: None
275
+ - `adafactor`: False
276
+ - `group_by_length`: False
277
+ - `length_column_name`: length
278
+ - `ddp_find_unused_parameters`: None
279
+ - `ddp_bucket_cap_mb`: None
280
+ - `ddp_broadcast_buffers`: False
281
+ - `dataloader_pin_memory`: True
282
+ - `dataloader_persistent_workers`: False
283
+ - `skip_memory_metrics`: True
284
+ - `use_legacy_prediction_loop`: False
285
+ - `push_to_hub`: False
286
+ - `resume_from_checkpoint`: True
287
+ - `hub_model_id`: None
288
+ - `hub_strategy`: every_save
289
+ - `hub_private_repo`: False
290
+ - `hub_always_push`: False
291
+ - `gradient_checkpointing`: False
292
+ - `gradient_checkpointing_kwargs`: None
293
+ - `include_inputs_for_metrics`: False
294
+ - `eval_do_concat_batches`: True
295
+ - `fp16_backend`: auto
296
+ - `push_to_hub_model_id`: None
297
+ - `push_to_hub_organization`: None
298
+ - `mp_parameters`:
299
+ - `auto_find_batch_size`: False
300
+ - `full_determinism`: False
301
+ - `torchdynamo`: None
302
+ - `ray_scope`: last
303
+ - `ddp_timeout`: 1800
304
+ - `torch_compile`: False
305
+ - `torch_compile_backend`: None
306
+ - `torch_compile_mode`: None
307
+ - `dispatch_batches`: None
308
+ - `split_batches`: None
309
+ - `include_tokens_per_second`: False
310
+ - `include_num_input_tokens_seen`: False
311
+ - `neftune_noise_alpha`: None
312
+ - `optim_target_modules`: None
313
+ - `batch_eval_metrics`: False
314
+ - `eval_on_start`: False
315
+ - `use_liger_kernel`: False
316
+ - `eval_use_gather_object`: False
317
+ - `batch_sampler`: batch_sampler
318
+ - `multi_dataset_batch_sampler`: proportional
319
+
320
+ </details>
321
+
322
+ ### Training Logs
323
+ <details><summary>Click to expand</summary>
324
+
325
+ | Epoch | Step | Training Loss | loss | beatai-dev_cosine_accuracy |
326
+ |:-------:|:----:|:-------------:|:------:|:--------------------------:|
327
+ | 0 | 0 | - | - | 0.4764 |
328
+ | 0.1471 | 10 | 0.2061 | - | - |
329
+ | 0.2941 | 20 | 0.2048 | - | - |
330
+ | 0.4412 | 30 | 0.204 | - | - |
331
+ | 0.5882 | 40 | 0.202 | - | - |
332
+ | 0.7353 | 50 | 0.2019 | 0.2010 | 0.5219 |
333
+ | 0.8824 | 60 | 0.2017 | - | - |
334
+ | 1.0294 | 70 | 0.1954 | - | - |
335
+ | 1.1765 | 80 | 0.1959 | - | - |
336
+ | 1.3235 | 90 | 0.1941 | - | - |
337
+ | 1.4706 | 100 | 0.1937 | 0.1929 | 0.5598 |
338
+ | 1.6176 | 110 | 0.1923 | - | - |
339
+ | 1.7647 | 120 | 0.1893 | - | - |
340
+ | 1.9118 | 130 | 0.1861 | - | - |
341
+ | 2.0588 | 140 | 0.1842 | - | - |
342
+ | 2.2059 | 150 | 0.1818 | 0.1814 | 0.5985 |
343
+ | 2.3529 | 160 | 0.1834 | - | - |
344
+ | 2.5 | 170 | 0.1729 | - | - |
345
+ | 2.6471 | 180 | 0.1726 | - | - |
346
+ | 2.7941 | 190 | 0.1668 | - | - |
347
+ | 2.9412 | 200 | 0.1622 | 0.1653 | 0.6330 |
348
+ | 3.0882 | 210 | 0.1604 | - | - |
349
+ | 3.2353 | 220 | 0.1572 | - | - |
350
+ | 3.3824 | 230 | 0.159 | - | - |
351
+ | 3.5294 | 240 | 0.1567 | - | - |
352
+ | 3.6765 | 250 | 0.1481 | 0.1562 | 0.6532 |
353
+ | 3.8235 | 260 | 0.148 | - | - |
354
+ | 3.9706 | 270 | 0.1492 | - | - |
355
+ | 4.1176 | 280 | 0.1528 | - | - |
356
+ | 4.2647 | 290 | 0.1437 | - | - |
357
+ | 4.4118 | 300 | 0.1481 | 0.1490 | 0.6658 |
358
+ | 4.5588 | 310 | 0.1386 | - | - |
359
+ | 4.7059 | 320 | 0.1413 | - | - |
360
+ | 4.8529 | 330 | 0.1407 | - | - |
361
+ | 5.0 | 340 | 0.1387 | - | - |
362
+ | 5.1471 | 350 | 0.1423 | 0.1438 | 0.6717 |
363
+ | 5.2941 | 360 | 0.1376 | - | - |
364
+ | 5.4412 | 370 | 0.1314 | - | - |
365
+ | 5.5882 | 380 | 0.1416 | - | - |
366
+ | 5.7353 | 390 | 0.1284 | - | - |
367
+ | 5.8824 | 400 | 0.1375 | 0.1394 | 0.6801 |
368
+ | 6.0294 | 410 | 0.1308 | - | - |
369
+ | 6.1765 | 420 | 0.1286 | - | - |
370
+ | 6.3235 | 430 | 0.1326 | - | - |
371
+ | 6.4706 | 440 | 0.1356 | - | - |
372
+ | 6.6176 | 450 | 0.1298 | 0.1361 | 0.6877 |
373
+ | 6.7647 | 460 | 0.1242 | - | - |
374
+ | 6.9118 | 470 | 0.1299 | - | - |
375
+ | 7.0588 | 480 | 0.1279 | - | - |
376
+ | 7.2059 | 490 | 0.1234 | - | - |
377
+ | 7.3529 | 500 | 0.1298 | 0.1333 | 0.7045 |
378
+ | 7.5 | 510 | 0.1252 | - | - |
379
+ | 7.6471 | 520 | 0.1248 | - | - |
380
+ | 7.7941 | 530 | 0.1241 | - | - |
381
+ | 7.9412 | 540 | 0.126 | - | - |
382
+ | 8.0882 | 550 | 0.1252 | 0.1316 | 0.7071 |
383
+ | 8.2353 | 560 | 0.1237 | - | - |
384
+ | 8.3824 | 570 | 0.1205 | - | - |
385
+ | 8.5294 | 580 | 0.1195 | - | - |
386
+ | 8.6765 | 590 | 0.1187 | - | - |
387
+ | 8.8235 | 600 | 0.1187 | 0.1293 | 0.7138 |
388
+ | 8.9706 | 610 | 0.1269 | - | - |
389
+ | 9.1176 | 620 | 0.1261 | - | - |
390
+ | 9.2647 | 630 | 0.1182 | - | - |
391
+ | 9.4118 | 640 | 0.1219 | - | - |
392
+ | 9.5588 | 650 | 0.1173 | 0.1276 | 0.7172 |
393
+ | 9.7059 | 660 | 0.1182 | - | - |
394
+ | 9.8529 | 670 | 0.122 | - | - |
395
+ | 10.0 | 680 | 0.1179 | - | - |
396
+ | 10.1471 | 690 | 0.1137 | - | - |
397
+ | 10.2941 | 700 | 0.1248 | 0.1261 | 0.7247 |
398
+ | 10.4412 | 710 | 0.1162 | - | - |
399
+ | 10.5882 | 720 | 0.1166 | - | - |
400
+ | 10.7353 | 730 | 0.1111 | - | - |
401
+ | 10.8824 | 740 | 0.115 | - | - |
402
+ | 11.0294 | 750 | 0.1175 | 0.1247 | 0.7298 |
403
+ | 11.1765 | 760 | 0.1136 | - | - |
404
+ | 11.3235 | 770 | 0.1172 | - | - |
405
+ | 11.4706 | 780 | 0.1158 | - | - |
406
+ | 11.6176 | 790 | 0.1142 | - | - |
407
+ | 11.7647 | 800 | 0.1097 | 0.1236 | 0.7332 |
408
+ | 11.9118 | 810 | 0.1161 | - | - |
409
+ | 12.0588 | 820 | 0.1153 | - | - |
410
+ | 12.2059 | 830 | 0.1114 | - | - |
411
+ | 12.3529 | 840 | 0.1133 | - | - |
412
+ | 12.5 | 850 | 0.1104 | 0.1226 | 0.7332 |
413
+ | 12.6471 | 860 | 0.1093 | - | - |
414
+ | 12.7941 | 870 | 0.1157 | - | - |
415
+ | 12.9412 | 880 | 0.1127 | - | - |
416
+ | 13.0882 | 890 | 0.1115 | - | - |
417
+ | 13.2353 | 900 | 0.1109 | 0.1214 | 0.7323 |
418
+ | 13.3824 | 910 | 0.1125 | - | - |
419
+ | 13.5294 | 920 | 0.1097 | - | - |
420
+ | 13.6765 | 930 | 0.1124 | - | - |
421
+ | 13.8235 | 940 | 0.114 | - | - |
422
+ | 13.9706 | 950 | 0.11 | 0.1204 | 0.7382 |
423
+ | 14.1176 | 960 | 0.1049 | - | - |
424
+ | 14.2647 | 970 | 0.1128 | - | - |
425
+ | 14.4118 | 980 | 0.1109 | - | - |
426
+ | 14.5588 | 990 | 0.1087 | - | - |
427
+ | 14.7059 | 1000 | 0.1079 | 0.1196 | 0.7382 |
428
+ | 14.8529 | 1010 | 0.1077 | - | - |
429
+ | 15.0 | 1020 | 0.1061 | - | - |
430
+ | 15.1471 | 1030 | 0.1101 | - | - |
431
+ | 15.2941 | 1040 | 0.1087 | - | - |
432
+ | 15.4412 | 1050 | 0.106 | 0.1186 | 0.7399 |
433
+ | 15.5882 | 1060 | 0.1047 | - | - |
434
+ | 15.7353 | 1070 | 0.1048 | - | - |
435
+ | 15.8824 | 1080 | 0.103 | - | - |
436
+ | 16.0294 | 1090 | 0.1064 | - | - |
437
+ | 16.1765 | 1100 | 0.1029 | 0.1179 | 0.7433 |
438
+ | 16.3235 | 1110 | 0.1033 | - | - |
439
+ | 16.4706 | 1120 | 0.1066 | - | - |
440
+ | 16.6176 | 1130 | 0.1095 | - | - |
441
+ | 16.7647 | 1140 | 0.1031 | - | - |
442
+ | 16.9118 | 1150 | 0.1 | 0.1172 | 0.7466 |
443
+ | 17.0588 | 1160 | 0.1056 | - | - |
444
+ | 17.2059 | 1170 | 0.1033 | - | - |
445
+ | 17.3529 | 1180 | 0.102 | - | - |
446
+ | 17.5 | 1190 | 0.1083 | - | - |
447
+ | 17.6471 | 1200 | 0.0971 | 0.1164 | 0.7458 |
448
+ | 17.7941 | 1210 | 0.1016 | - | - |
449
+ | 17.9412 | 1220 | 0.1033 | - | - |
450
+ | 18.0882 | 1230 | 0.0987 | - | - |
451
+ | 18.2353 | 1240 | 0.1062 | - | - |
452
+ | 18.3824 | 1250 | 0.0925 | 0.1157 | 0.7475 |
453
+ | 18.5294 | 1260 | 0.1028 | - | - |
454
+ | 18.6765 | 1270 | 0.1012 | - | - |
455
+ | 18.8235 | 1280 | 0.1027 | - | - |
456
+ | 18.9706 | 1290 | 0.1026 | - | - |
457
+ | 19.1176 | 1300 | 0.1023 | 0.1148 | 0.7508 |
458
+ | 19.2647 | 1310 | 0.1053 | - | - |
459
+ | 19.4118 | 1320 | 0.0981 | - | - |
460
+ | 19.5588 | 1330 | 0.0975 | - | - |
461
+ | 19.7059 | 1340 | 0.1006 | - | - |
462
+ | 19.8529 | 1350 | 0.0991 | 0.1141 | 0.7508 |
463
+ | 20.0 | 1360 | 0.0994 | - | - |
464
+ | 20.1471 | 1370 | 0.0998 | - | - |
465
+ | 20.2941 | 1380 | 0.1014 | - | - |
466
+ | 20.4412 | 1390 | 0.0986 | - | - |
467
+ | 20.5882 | 1400 | 0.098 | 0.1133 | 0.7525 |
468
+ | 20.7353 | 1410 | 0.101 | - | - |
469
+ | 20.8824 | 1420 | 0.098 | - | - |
470
+ | 21.0294 | 1430 | 0.1041 | - | - |
471
+ | 21.1765 | 1440 | 0.0979 | - | - |
472
+ | 21.3235 | 1450 | 0.1006 | 0.1126 | 0.7559 |
473
+ | 21.4706 | 1460 | 0.097 | - | - |
474
+ | 21.6176 | 1470 | 0.0985 | - | - |
475
+ | 21.7647 | 1480 | 0.0956 | - | - |
476
+ | 21.9118 | 1490 | 0.0993 | - | - |
477
+ | 22.0588 | 1500 | 0.0943 | 0.1120 | 0.7551 |
478
+ | 22.2059 | 1510 | 0.0977 | - | - |
479
+ | 22.3529 | 1520 | 0.0998 | - | - |
480
+ | 22.5 | 1530 | 0.0977 | - | - |
481
+ | 22.6471 | 1540 | 0.099 | - | - |
482
+ | 22.7941 | 1550 | 0.0925 | 0.1113 | 0.7576 |
483
+ | 22.9412 | 1560 | 0.0929 | - | - |
484
+ | 23.0882 | 1570 | 0.0965 | - | - |
485
+ | 23.2353 | 1580 | 0.0896 | - | - |
486
+ | 23.3824 | 1590 | 0.0993 | - | - |
487
+ | 23.5294 | 1600 | 0.0941 | 0.1109 | 0.7576 |
488
+ | 23.6765 | 1610 | 0.0927 | - | - |
489
+ | 23.8235 | 1620 | 0.0994 | - | - |
490
+ | 23.9706 | 1630 | 0.0956 | - | - |
491
+ | 24.1176 | 1640 | 0.0947 | - | - |
492
+ | 24.2647 | 1650 | 0.0927 | 0.1103 | 0.7576 |
493
+ | 24.4118 | 1660 | 0.0935 | - | - |
494
+ | 24.5588 | 1670 | 0.0996 | - | - |
495
+ | 24.7059 | 1680 | 0.0903 | - | - |
496
+ | 24.8529 | 1690 | 0.0916 | - | - |
497
+ | 25.0 | 1700 | 0.0951 | 0.1096 | 0.7584 |
498
+ | 25.1471 | 1710 | 0.0924 | - | - |
499
+ | 25.2941 | 1720 | 0.0952 | - | - |
500
+ | 25.4412 | 1730 | 0.0954 | - | - |
501
+ | 25.5882 | 1740 | 0.0968 | - | - |
502
+ | 25.7353 | 1750 | 0.0942 | 0.1090 | 0.7593 |
503
+ | 25.8824 | 1760 | 0.0913 | - | - |
504
+ | 26.0294 | 1770 | 0.0931 | - | - |
505
+ | 26.1765 | 1780 | 0.0872 | - | - |
506
+ | 26.3235 | 1790 | 0.0915 | - | - |
507
+ | 26.4706 | 1800 | 0.0937 | 0.1085 | 0.7601 |
508
+ | 26.6176 | 1810 | 0.0971 | - | - |
509
+ | 26.7647 | 1820 | 0.0944 | - | - |
510
+ | 26.9118 | 1830 | 0.0908 | - | - |
511
+ | 27.0588 | 1840 | 0.089 | - | - |
512
+ | 27.2059 | 1850 | 0.0944 | 0.1082 | 0.7626 |
513
+ | 27.3529 | 1860 | 0.0926 | - | - |
514
+ | 27.5 | 1870 | 0.087 | - | - |
515
+ | 27.6471 | 1880 | 0.0904 | - | - |
516
+ | 27.7941 | 1890 | 0.0886 | - | - |
517
+ | 27.9412 | 1900 | 0.0942 | 0.1077 | 0.7635 |
518
+ | 28.0882 | 1910 | 0.0947 | - | - |
519
+ | 28.2353 | 1920 | 0.0857 | - | - |
520
+ | 28.3824 | 1930 | 0.0908 | - | - |
521
+ | 28.5294 | 1940 | 0.0943 | - | - |
522
+ | 28.6765 | 1950 | 0.0902 | 0.1071 | 0.7668 |
523
+ | 28.8235 | 1960 | 0.0909 | - | - |
524
+ | 28.9706 | 1970 | 0.0897 | - | - |
525
+ | 29.1176 | 1980 | 0.0924 | - | - |
526
+ | 29.2647 | 1990 | 0.0909 | - | - |
527
+ | 29.4118 | 2000 | 0.0895 | 0.1066 | 0.7652 |
528
+ | 29.5588 | 2010 | 0.0832 | - | - |
529
+ | 29.7059 | 2020 | 0.0883 | - | - |
530
+ | 29.8529 | 2030 | 0.0935 | - | - |
531
+ | 30.0 | 2040 | 0.09 | - | - |
532
+ | 30.1471 | 2050 | 0.0891 | 0.1060 | 0.7677 |
533
+ | 30.2941 | 2060 | 0.0978 | - | - |
534
+ | 30.4412 | 2070 | 0.0894 | - | - |
535
+ | 30.5882 | 2080 | 0.0893 | - | - |
536
+ | 30.7353 | 2090 | 0.0815 | - | - |
537
+ | 30.8824 | 2100 | 0.0889 | 0.1058 | 0.7660 |
538
+ | 31.0294 | 2110 | 0.0801 | - | - |
539
+ | 31.1765 | 2120 | 0.0922 | - | - |
540
+ | 31.3235 | 2130 | 0.0868 | - | - |
541
+ | 31.4706 | 2140 | 0.0858 | - | - |
542
+ | 31.6176 | 2150 | 0.0862 | 0.1055 | 0.7685 |
543
+ | 31.7647 | 2160 | 0.0861 | - | - |
544
+ | 31.9118 | 2170 | 0.0896 | - | - |
545
+ | 32.0588 | 2180 | 0.0877 | - | - |
546
+ | 32.2059 | 2190 | 0.0864 | - | - |
547
+ | 32.3529 | 2200 | 0.0921 | 0.1050 | 0.7694 |
548
+ | 32.5 | 2210 | 0.082 | - | - |
549
+ | 32.6471 | 2220 | 0.0902 | - | - |
550
+ | 32.7941 | 2230 | 0.0825 | - | - |
551
+ | 32.9412 | 2240 | 0.0829 | - | - |
552
+ | 33.0882 | 2250 | 0.0859 | 0.1046 | 0.7694 |
553
+ | 33.2353 | 2260 | 0.0847 | - | - |
554
+ | 33.3824 | 2270 | 0.0829 | - | - |
555
+ | 33.5294 | 2280 | 0.0841 | - | - |
556
+ | 33.6765 | 2290 | 0.0833 | - | - |
557
+ | 33.8235 | 2300 | 0.0899 | 0.1042 | 0.7710 |
558
+ | 33.9706 | 2310 | 0.0789 | - | - |
559
+ | 34.1176 | 2320 | 0.0809 | - | - |
560
+ | 34.2647 | 2330 | 0.0835 | - | - |
561
+ | 34.4118 | 2340 | 0.0816 | - | - |
562
+ | 34.5588 | 2350 | 0.0803 | 0.1038 | 0.7744 |
563
+ | 34.7059 | 2360 | 0.0808 | - | - |
564
+ | 34.8529 | 2370 | 0.0867 | - | - |
565
+ | 35.0 | 2380 | 0.0878 | - | - |
566
+ | 35.1471 | 2390 | 0.0869 | - | - |
567
+ | 35.2941 | 2400 | 0.0785 | 0.1034 | 0.7753 |
568
+ | 35.4412 | 2410 | 0.0849 | - | - |
569
+ | 35.5882 | 2420 | 0.0832 | - | - |
570
+ | 35.7353 | 2430 | 0.0799 | - | - |
571
+ | 35.8824 | 2440 | 0.0813 | - | - |
572
+ | 36.0294 | 2450 | 0.0801 | 0.1029 | 0.7753 |
573
+ | 36.1765 | 2460 | 0.0771 | - | - |
574
+ | 36.3235 | 2470 | 0.0828 | - | - |
575
+ | 36.4706 | 2480 | 0.0837 | - | - |
576
+ | 36.6176 | 2490 | 0.0774 | - | - |
577
+ | 36.7647 | 2500 | 0.0822 | 0.1026 | 0.7769 |
578
+ | 36.9118 | 2510 | 0.0845 | - | - |
579
+ | 37.0588 | 2520 | 0.0882 | - | - |
580
+ | 37.2059 | 2530 | 0.0802 | - | - |
581
+ | 37.3529 | 2540 | 0.0806 | - | - |
582
+ | 37.5 | 2550 | 0.0809 | 0.1022 | 0.7795 |
583
+ | 37.6471 | 2560 | 0.0806 | - | - |
584
+ | 37.7941 | 2570 | 0.0788 | - | - |
585
+ | 37.9412 | 2580 | 0.0858 | - | - |
586
+ | 38.0882 | 2590 | 0.0791 | - | - |
587
+ | 38.2353 | 2600 | 0.0842 | 0.1018 | 0.7795 |
588
+ | 38.3824 | 2610 | 0.0799 | - | - |
589
+ | 38.5294 | 2620 | 0.0769 | - | - |
590
+ | 38.6765 | 2630 | 0.0823 | - | - |
591
+ | 38.8235 | 2640 | 0.0784 | - | - |
592
+ | 38.9706 | 2650 | 0.0863 | 0.1016 | 0.7795 |
593
+ | 39.1176 | 2660 | 0.0751 | - | - |
594
+ | 39.2647 | 2670 | 0.0847 | - | - |
595
+ | 39.4118 | 2680 | 0.0784 | - | - |
596
+ | 39.5588 | 2690 | 0.0799 | - | - |
597
+ | 39.7059 | 2700 | 0.0771 | 0.1013 | 0.7811 |
598
+ | 39.8529 | 2710 | 0.0763 | - | - |
599
+ | 40.0 | 2720 | 0.0783 | - | - |
600
+ | 40.1471 | 2730 | 0.0784 | - | - |
601
+ | 40.2941 | 2740 | 0.0761 | - | - |
602
+ | 40.4412 | 2750 | 0.0797 | 0.1011 | 0.7837 |
603
+ | 40.5882 | 2760 | 0.0809 | - | - |
604
+ | 40.7353 | 2770 | 0.0758 | - | - |
605
+ | 40.8824 | 2780 | 0.0777 | - | - |
606
+ | 41.0294 | 2790 | 0.0777 | - | - |
607
+ | 41.1765 | 2800 | 0.0806 | 0.1006 | 0.7786 |
608
+ | 41.3235 | 2810 | 0.0852 | - | - |
609
+ | 41.4706 | 2820 | 0.079 | - | - |
610
+ | 41.6176 | 2830 | 0.0749 | - | - |
611
+ | 41.7647 | 2840 | 0.0805 | - | - |
612
+ | 41.9118 | 2850 | 0.0779 | 0.1003 | 0.7854 |
613
+ | 42.0588 | 2860 | 0.0759 | - | - |
614
+ | 42.2059 | 2870 | 0.0794 | - | - |
615
+ | 42.3529 | 2880 | 0.0811 | - | - |
616
+ | 42.5 | 2890 | 0.0772 | - | - |
617
+ | 42.6471 | 2900 | 0.0757 | 0.1001 | 0.7828 |
618
+ | 42.7941 | 2910 | 0.0781 | - | - |
619
+ | 42.9412 | 2920 | 0.0751 | - | - |
620
+ | 43.0882 | 2930 | 0.0752 | - | - |
621
+ | 43.2353 | 2940 | 0.079 | - | - |
622
+ | 43.3824 | 2950 | 0.076 | 0.0997 | 0.7811 |
623
+ | 43.5294 | 2960 | 0.0783 | - | - |
624
+ | 43.6765 | 2970 | 0.0774 | - | - |
625
+ | 43.8235 | 2980 | 0.07 | - | - |
626
+ | 43.9706 | 2990 | 0.073 | - | - |
627
+ | 44.1176 | 3000 | 0.0762 | 0.0993 | 0.7854 |
628
+ | 44.2647 | 3010 | 0.0749 | - | - |
629
+ | 44.4118 | 3020 | 0.0782 | - | - |
630
+ | 44.5588 | 3030 | 0.0764 | - | - |
631
+ | 44.7059 | 3040 | 0.0759 | - | - |
632
+ | 44.8529 | 3050 | 0.0769 | 0.0991 | 0.7887 |
633
+ | 45.0 | 3060 | 0.0754 | - | - |
634
+ | 45.1471 | 3070 | 0.0744 | - | - |
635
+ | 45.2941 | 3080 | 0.0767 | - | - |
636
+ | 45.4412 | 3090 | 0.0724 | - | - |
637
+ | 45.5882 | 3100 | 0.0742 | 0.0989 | 0.7870 |
638
+ | 45.7353 | 3110 | 0.0745 | - | - |
639
+ | 45.8824 | 3120 | 0.076 | - | - |
640
+ | 46.0294 | 3130 | 0.0666 | - | - |
641
+ | 46.1765 | 3140 | 0.0801 | - | - |
642
+ | 46.3235 | 3150 | 0.0734 | 0.0985 | 0.7887 |
643
+ | 46.4706 | 3160 | 0.0703 | - | - |
644
+ | 46.6176 | 3170 | 0.0772 | - | - |
645
+ | 46.7647 | 3180 | 0.0763 | - | - |
646
+ | 46.9118 | 3190 | 0.0718 | - | - |
647
+ | 47.0588 | 3200 | 0.0724 | 0.0981 | 0.7904 |
648
+ | 47.2059 | 3210 | 0.0755 | - | - |
649
+ | 47.3529 | 3220 | 0.0719 | - | - |
650
+ | 47.5 | 3230 | 0.0742 | - | - |
651
+ | 47.6471 | 3240 | 0.074 | - | - |
652
+ | 47.7941 | 3250 | 0.0758 | 0.0980 | 0.7921 |
653
+ | 47.9412 | 3260 | 0.0727 | - | - |
654
+ | 48.0882 | 3270 | 0.0676 | - | - |
655
+ | 48.2353 | 3280 | 0.0791 | - | - |
656
+ | 48.3824 | 3290 | 0.0751 | - | - |
657
+ | 48.5294 | 3300 | 0.075 | 0.0977 | 0.7887 |
658
+ | 48.6765 | 3310 | 0.0738 | - | - |
659
+ | 48.8235 | 3320 | 0.0689 | - | - |
660
+ | 48.9706 | 3330 | 0.0706 | - | - |
661
+ | 49.1176 | 3340 | 0.0671 | - | - |
662
+ | 49.2647 | 3350 | 0.0744 | 0.0974 | 0.7971 |
663
+ | 49.4118 | 3360 | 0.0739 | - | - |
664
+ | 49.5588 | 3370 | 0.0721 | - | - |
665
+ | 49.7059 | 3380 | 0.073 | - | - |
666
+ | 49.8529 | 3390 | 0.0707 | - | - |
667
+ | 50.0 | 3400 | 0.0689 | 0.0972 | 0.7929 |

</details>

### Framework Versions
- Python: 3.8.18
- Sentence Transformers: 3.1.1
- Transformers: 4.45.1
- PyTorch: 1.13.1+cu117
- Accelerate: 0.34.2
- Datasets: 3.0.0
- Tokenizers: 0.20.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### TripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,26 @@
{
  "_name_or_path": "dbourget/pb-small-10e-tsdae6e-philsim-cosine-3e-pt1",
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "tokenizer_class": "PreTrainedTokenizerFast",
  "torch_dtype": "float32",
  "transformers_version": "4.45.1",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
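This is the plain BERT encoder underlying the sentence-transformers wrapper. If you bypass sentence-transformers, the encoder can be used directly with 🤗 Transformers plus manual CLS pooling, matching `1_Pooling/config.json`; a hedged sketch, not an official snippet from this commit:

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo = "dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

batch = tokenizer(["scientific revolutions", "paradigm shifts"],
                  padding=True, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    outputs = model(**batch)

# CLS pooling: take the hidden state of the first ([CLS]) token as the sentence embedding.
embeddings = outputs.last_hidden_state[:, 0]
print(embeddings.shape)
```
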
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.1.1",
    "transformers": "4.45.1",
    "pytorch": "1.13.1+cu117"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": null
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1a01319756c4c10f61d571b5f19638b81955c7c9eb42f0dd7356388a217fa78a
size 437951328
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,44 @@
{
  "additional_special_tokens": [
    "[PAD]",
    "[UNK]",
    "[CLS]",
    "[SEP]",
    "[MASK]"
  ],
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,66 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "1": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "2": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "3": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "4": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "additional_special_tokens": [
    "[PAD]",
    "[UNK]",
    "[CLS]",
    "[SEP]",
    "[MASK]"
  ],
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "max_length": 512,
  "model_max_length": 512,
  "pad_to_multiple_of": null,
  "pad_token": "[PAD]",
  "pad_token_type_id": 0,
  "padding_side": "right",
  "sep_token": "[SEP]",
  "stride": 0,
  "tokenizer_class": "PreTrainedTokenizerFast",
  "truncation_side": "right",
  "truncation_strategy": "longest_first",
  "unk_token": "[UNK]"
}
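As a quick sanity check, the tokenizer shipped with this commit should report the 512-token limit and BERT-style special tokens declared in the two files above; an illustrative sketch:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dbourget/pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-cosine-50e")

print(tokenizer.model_max_length)                                     # 512
print(tokenizer.cls_token, tokenizer.sep_token, tokenizer.pad_token)  # [CLS] [SEP] [PAD]

encoded = tokenizer("doxastic voluntarism", truncation=True, max_length=512)
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))          # starts with [CLS], ends with [SEP]
```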