CocoRoF committed (verified)
Commit 8ea8efa · Parent(s): bb182c7

CocoRoF/ModernBERT-SimCSE-multitask_v03-retry
2_Dense/model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:2fa0062d6d38c9ca7ccf5338c945d80b51ec0d3a19ce30227bc0a04f4581b231
+oid sha256:ab1b6c22d3c814406c6f3188dfcacf642adf35b296791ccbffdbe2b43a7be833
 size 3149984
README.md CHANGED
@@ -58,34 +58,34 @@ model-index:
       type: sts_dev
     metrics:
     - type: pearson_cosine
-      value: 0.76867327483615
+      value: 0.7885728442437165
       name: Pearson Cosine
     - type: spearman_cosine
-      value: 0.769585744296376
+      value: 0.7890106880187878
       name: Spearman Cosine
     - type: pearson_euclidean
-      value: 0.7124057303081486
+      value: 0.7209624590910948
       name: Pearson Euclidean
     - type: spearman_euclidean
-      value: 0.7051149677418829
+      value: 0.7132906703480484
       name: Spearman Euclidean
     - type: pearson_manhattan
-      value: 0.7142558316733773
+      value: 0.7228003273015342
       name: Pearson Manhattan
     - type: spearman_manhattan
-      value: 0.7078418919542185
+      value: 0.7161151111265872
       name: Spearman Manhattan
     - type: pearson_dot
-      value: 0.6760448465343732
+      value: 0.7119673656141701
       name: Pearson Dot
     - type: spearman_dot
-      value: 0.6653382275787475
+      value: 0.7059066541365785
       name: Spearman Dot
     - type: pearson_max
-      value: 0.76867327483615
+      value: 0.7885728442437165
       name: Pearson Max
     - type: spearman_max
-      value: 0.769585744296376
+      value: 0.7890106880187878
       name: Spearman Max
 ---
 
@@ -186,18 +186,18 @@ You can finetune this model on your own dataset.
 * Dataset: `sts_dev`
 * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
 
-| Metric             | Value      |
-|:-------------------|:-----------|
-| pearson_cosine     | 0.7687     |
-| spearman_cosine    | 0.7696     |
-| pearson_euclidean  | 0.7124     |
-| spearman_euclidean | 0.7051     |
-| pearson_manhattan  | 0.7143     |
-| spearman_manhattan | 0.7078     |
-| pearson_dot        | 0.676      |
-| spearman_dot       | 0.6653     |
-| pearson_max        | 0.7687     |
-| **spearman_max**   | **0.7696** |
+| Metric             | Value     |
+|:-------------------|:----------|
+| pearson_cosine     | 0.7886    |
+| spearman_cosine    | 0.789     |
+| pearson_euclidean  | 0.721     |
+| spearman_euclidean | 0.7133    |
+| pearson_manhattan  | 0.7228    |
+| spearman_manhattan | 0.7161    |
+| pearson_dot        | 0.712     |
+| spearman_dot       | 0.7059    |
+| pearson_max        | 0.7886    |
+| **spearman_max**   | **0.789** |
 
 <!--
 ## Bias, Risks and Limitations
@@ -266,10 +266,11 @@ You can finetune this model on your own dataset.
 
 - `overwrite_output_dir`: True
 - `eval_strategy`: steps
-- `per_device_train_batch_size`: 16
-- `per_device_eval_batch_size`: 16
-- `gradient_accumulation_steps`: 8
+- `per_device_train_batch_size`: 1
+- `per_device_eval_batch_size`: 1
+- `gradient_accumulation_steps`: 16
 - `learning_rate`: 8e-05
+- `num_train_epochs`: 10.0
 - `warmup_ratio`: 0.2
 - `push_to_hub`: True
 - `hub_model_id`: CocoRoF/ModernBERT-SimCSE-multitask_v03-retry
@@ -283,11 +284,11 @@ You can finetune this model on your own dataset.
 - `do_predict`: False
 - `eval_strategy`: steps
 - `prediction_loss_only`: True
-- `per_device_train_batch_size`: 16
-- `per_device_eval_batch_size`: 16
+- `per_device_train_batch_size`: 1
+- `per_device_eval_batch_size`: 1
 - `per_gpu_train_batch_size`: None
 - `per_gpu_eval_batch_size`: None
-- `gradient_accumulation_steps`: 8
+- `gradient_accumulation_steps`: 16
 - `eval_accumulation_steps`: None
 - `torch_empty_cache_steps`: None
 - `learning_rate`: 8e-05
@@ -296,7 +297,7 @@ You can finetune this model on your own dataset.
 - `adam_beta2`: 0.999
 - `adam_epsilon`: 1e-08
 - `max_grad_norm`: 1.0
-- `num_train_epochs`: 3.0
+- `num_train_epochs`: 10.0
 - `max_steps`: -1
 - `lr_scheduler_type`: linear
 - `lr_scheduler_kwargs`: {}
@@ -400,12 +401,94 @@ You can finetune this model on your own dataset.
 ### Training Logs
 | Epoch | Step | Training Loss | Validation Loss | sts_dev_spearman_max |
 |:------:|:----:|:-------------:|:---------------:|:--------------------:|
-| 0.5455 | 3 | - | 0.0373 | 0.7480 |
-| 1.0 | 6 | - | 0.0368 | 0.7514 |
-| 1.5455 | 9 | - | 0.0360 | 0.7563 |
-| 1.7273 | 10 | 0.3102 | - | - |
-| 2.0 | 12 | - | 0.0352 | 0.7631 |
-| 2.5455 | 15 | - | 0.0344 | 0.7696 |
+| 0.1114 | 5 | - | 0.0377 | 0.7471 |
+| 0.2228 | 10 | 0.6923 | 0.0377 | 0.7471 |
+| 0.3343 | 15 | - | 0.0376 | 0.7473 |
+| 0.4457 | 20 | 0.6832 | 0.0376 | 0.7475 |
+| 0.5571 | 25 | - | 0.0375 | 0.7479 |
+| 0.6685 | 30 | 0.6787 | 0.0375 | 0.7484 |
+| 0.7799 | 35 | - | 0.0374 | 0.7488 |
+| 0.8914 | 40 | 0.6154 | 0.0373 | 0.7494 |
+| 1.0223 | 45 | - | 0.0372 | 0.7500 |
+| 1.1337 | 50 | 0.6231 | 0.0371 | 0.7506 |
+| 1.2451 | 55 | - | 0.0370 | 0.7512 |
+| 1.3565 | 60 | 0.6562 | 0.0369 | 0.7519 |
+| 1.4680 | 65 | - | 0.0368 | 0.7526 |
+| 1.5794 | 70 | 0.6578 | 0.0366 | 0.7534 |
+| 1.6908 | 75 | - | 0.0365 | 0.7541 |
+| 1.8022 | 80 | 0.6669 | 0.0364 | 0.7549 |
+| 1.9136 | 85 | - | 0.0363 | 0.7559 |
+| 2.0446 | 90 | 0.6428 | 0.0361 | 0.7568 |
+| 2.1560 | 95 | - | 0.0360 | 0.7577 |
+| 2.2674 | 100 | 0.5854 | 0.0358 | 0.7586 |
+| 2.3788 | 105 | - | 0.0357 | 0.7597 |
+| 2.4903 | 110 | 0.6027 | 0.0356 | 0.7607 |
+| 2.6017 | 115 | - | 0.0354 | 0.7618 |
+| 2.7131 | 120 | 0.6375 | 0.0353 | 0.7627 |
+| 2.8245 | 125 | - | 0.0351 | 0.7635 |
+| 2.9359 | 130 | 0.6204 | 0.0350 | 0.7643 |
+| 3.0669 | 135 | - | 0.0348 | 0.7653 |
+| 3.1783 | 140 | 0.6077 | 0.0347 | 0.7663 |
+| 3.2897 | 145 | - | 0.0346 | 0.7672 |
+| 3.4011 | 150 | 0.5772 | 0.0344 | 0.7681 |
+| 3.5125 | 155 | - | 0.0343 | 0.7690 |
+| 3.6240 | 160 | 0.5793 | 0.0341 | 0.7698 |
+| 3.7354 | 165 | - | 0.0340 | 0.7705 |
+| 3.8468 | 170 | 0.5807 | 0.0338 | 0.7712 |
+| 3.9582 | 175 | - | 0.0337 | 0.7721 |
+| 4.0891 | 180 | 0.5576 | 0.0336 | 0.7729 |
+| 4.2006 | 185 | - | 0.0334 | 0.7734 |
+| 4.3120 | 190 | 0.5244 | 0.0333 | 0.7740 |
+| 4.4234 | 195 | - | 0.0332 | 0.7748 |
+| 4.5348 | 200 | 0.539 | 0.0331 | 0.7754 |
+| 4.6462 | 205 | - | 0.0330 | 0.7760 |
+| 4.7577 | 210 | 0.5517 | 0.0329 | 0.7765 |
+| 4.8691 | 215 | - | 0.0328 | 0.7769 |
+| 4.9805 | 220 | 0.5265 | 0.0327 | 0.7776 |
+| 5.1114 | 225 | - | 0.0326 | 0.7780 |
+| 5.2228 | 230 | 0.5285 | 0.0325 | 0.7783 |
+| 5.3343 | 235 | - | 0.0324 | 0.7789 |
+| 5.4457 | 240 | 0.4697 | 0.0323 | 0.7793 |
+| 5.5571 | 245 | - | 0.0323 | 0.7798 |
+| 5.6685 | 250 | 0.4913 | 0.0322 | 0.7804 |
+| 5.7799 | 255 | - | 0.0321 | 0.7809 |
+| 5.8914 | 260 | 0.5253 | 0.0320 | 0.7813 |
+| 6.0223 | 265 | - | 0.0320 | 0.7817 |
+| 6.1337 | 270 | 0.4924 | 0.0319 | 0.7819 |
+| 6.2451 | 275 | - | 0.0318 | 0.7820 |
+| 6.3565 | 280 | 0.4844 | 0.0317 | 0.7822 |
+| 6.4680 | 285 | - | 0.0317 | 0.7825 |
+| 6.5794 | 290 | 0.442 | 0.0316 | 0.7827 |
+| 6.6908 | 295 | - | 0.0315 | 0.7830 |
+| 6.8022 | 300 | 0.4665 | 0.0314 | 0.7834 |
+| 6.9136 | 305 | - | 0.0314 | 0.7839 |
+| 7.0446 | 310 | 0.4672 | 0.0314 | 0.7843 |
+| 7.1560 | 315 | - | 0.0314 | 0.7851 |
+| 7.2674 | 320 | 0.4131 | 0.0314 | 0.7850 |
+| 7.3788 | 325 | - | 0.0313 | 0.7849 |
+| 7.4903 | 330 | 0.4221 | 0.0312 | 0.7848 |
+| 7.6017 | 335 | - | 0.0311 | 0.7854 |
+| 7.7131 | 340 | 0.4268 | 0.0310 | 0.7857 |
+| 7.8245 | 345 | - | 0.0309 | 0.7861 |
+| 7.9359 | 350 | 0.4316 | 0.0309 | 0.7866 |
+| 8.0669 | 355 | - | 0.0309 | 0.7872 |
+| 8.1783 | 360 | 0.4277 | 0.0309 | 0.7873 |
+| 8.2897 | 365 | - | 0.0308 | 0.7870 |
+| 8.4011 | 370 | 0.3925 | 0.0308 | 0.7868 |
+| 8.5125 | 375 | - | 0.0308 | 0.7866 |
+| 8.6240 | 380 | 0.4049 | 0.0308 | 0.7869 |
+| 8.7354 | 385 | - | 0.0308 | 0.7875 |
+| 8.8468 | 390 | 0.3742 | 0.0308 | 0.7883 |
+| 8.9582 | 395 | - | 0.0307 | 0.7885 |
+| 9.0891 | 400 | 0.3498 | 0.0307 | 0.7886 |
+| 9.2006 | 405 | - | 0.0307 | 0.7881 |
+| 9.3120 | 410 | 0.3569 | 0.0307 | 0.7878 |
+| 9.4234 | 415 | - | 0.0307 | 0.7876 |
+| 9.5348 | 420 | 0.3312 | 0.0306 | 0.7877 |
+| 9.6462 | 425 | - | 0.0305 | 0.7881 |
+| 9.7577 | 430 | 0.3848 | 0.0304 | 0.7885 |
+| 9.8691 | 435 | - | 0.0304 | 0.7889 |
+| 9.9805 | 440 | 0.332 | 0.0305 | 0.7890 |
 
 
 ### Framework Versions
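The pearson_* and spearman_* figures above are linear and rank correlations between the model's pairwise similarity scores and the gold STS labels. A minimal pure-Python sketch of that computation on toy, hypothetical data (not the actual sts_dev set, and without the tie handling a real Spearman implementation would use):

```python
import math

def pearson(x, y):
    # Linear (Pearson) correlation between two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    # Pearson correlation of the rank-transformed values
    # (no tie correction -- fine for this illustration).
    rank = lambda v: [sorted(v).index(e) for e in v]
    return pearson(rank(x), rank(y))

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical 2-d embeddings for three sentence pairs, plus gold labels.
emb_a = [[1.0, 0.0], [0.8, 0.6], [0.0, 1.0]]
emb_b = [[0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
gold  = [5.0, 1.0, 4.5]

sims = [cosine(u, v) for u, v in zip(emb_a, emb_b)]
print(round(pearson(sims, gold), 4), round(spearman(sims, gold), 4))
```

The real evaluation is done by sentence-transformers' `EmbeddingSimilarityEvaluator`, which repeats this for cosine, Euclidean, Manhattan, and dot-product similarities.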
eval/similarity_evaluation_sts_dev_results.csv CHANGED
@@ -703,3 +703,43 @@ epoch,steps,cosine_pearson,cosine_spearman,euclidean_pearson,euclidean_spearman,
 9.423398328690809,415,0.7868969694010064,0.7876455382973624,0.718941367166557,0.7107299275769324,0.7208149791790056,0.7135530150298626,0.7111765993872031,0.704719651197283
 9.423398328690809,415,0.7868969694010064,0.7876455382973624,0.718941367166557,0.7107299275769324,0.7208149791790056,0.7135530150298626,0.7111765993872031,0.704719651197283
 9.423398328690809,415,0.7868969694010064,0.7876455382973624,0.718941367166557,0.7107299275769324,0.7208149791790056,0.7135530150298626,0.7111765993872031,0.704719651197283
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.534818941504179,420,0.7871582146601501,0.7876798303111584,0.7188247909750505,0.7105139703715937,0.7207347787107719,0.7134038294038588,0.7115609441039149,0.7054176024199568
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.64623955431755,425,0.7879156066971706,0.7881341181735492,0.7198865923068452,0.7117072407357017,0.7217957273879476,0.714723698124005,0.711201588266644,0.705110653419886
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.757660167130918,430,0.788344661991361,0.788491517925351,0.7203673673579825,0.7122847759317772,0.7222617545181341,0.715345146664245,0.7113191054201157,0.7051474995781294
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.869080779944289,435,0.7886447974538666,0.7889309986015577,0.7209074928596302,0.7130842264842505,0.7227708714770562,0.7159395965036961,0.7116018360759614,0.7053912897783909
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
+9.98050139275766,440,0.7885728442437165,0.7890106880187878,0.7209624590910948,0.7132906703480484,0.7228003273015342,0.7161151111265872,0.7119673656141701,0.7059066541365785
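Each CSV row carries Pearson and Spearman correlations for four similarity spaces (cosine, Euclidean, Manhattan, dot product), and the pearson_max / spearman_max values in the model card's YAML header are the maximum across those spaces. A quick check against the final (step 440) row above:

```python
# Final-step correlations, taken verbatim from the CSV's last row.
pearson_by_space = {
    "cosine": 0.7885728442437165,
    "euclidean": 0.7209624590910948,
    "manhattan": 0.7228003273015342,
    "dot": 0.7119673656141701,
}
spearman_by_space = {
    "cosine": 0.7890106880187878,
    "euclidean": 0.7132906703480484,
    "manhattan": 0.7161151111265872,
    "dot": 0.7059066541365785,
}

# The *_max metrics are the maxima over the four spaces; here both
# come from the cosine space, matching the YAML header values.
pearson_max = max(pearson_by_space.values())
spearman_max = max(spearman_by_space.values())
print(pearson_max, spearman_max)
```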
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9f26b706c07e140e2edd57fafcc709e1a43ae165be88a326e339c41e3237937a
+oid sha256:450580b5008ac6a91afd1af3da982e02887b07b2548c937195f47b59a050cb3f
 size 735216376
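Both safetensors diffs in this commit touch only Git LFS pointer files (the oid and size fields), not the weight blobs themselves. A small illustrative parser for this pointer format (a hypothetical helper, not part of the repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", e.g. "size 735216376".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer for model.safetensors, as shown in the diff above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:450580b5008ac6a91afd1af3da982e02887b07b2548c937195f47b59a050cb3f
size 735216376
"""

info = parse_lfs_pointer(pointer)
print(info["oid"], info["size"])
```

Since the size is unchanged (735216376 bytes) while the oid differs, the commit replaces the weights with a retrained checkpoint of identical shape.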