train_loss = AdaptiveLayerLoss(
    model=model,
    loss=train_loss,
    n_layers_per_step=-1,
    last_layer_weight=1.5,
    prior_layers_weight=0.15,
    kl_div_weight=2,
    kl_temperature=2,
)
num_epochs = 2
learning_rate = 2e-5
warmup_ratio = 0.25
weight_decay = 5e-7
schedule = "cosine_with_restarts"
num_cycles = 3
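With `warmup_ratio = 0.25` and `schedule = "cosine_with_restarts"`, the learning rate ramps up linearly over the first quarter of training and then runs `num_cycles = 3` hard cosine restarts. A small sketch of that schedule shape (assumed to mirror transformers' `get_cosine_with_hard_restarts_schedule_with_warmup`):

```python
import math

def lr_at(step, total_steps, base_lr=2e-5, warmup_ratio=0.25, num_cycles=3):
    """Linear warmup, then `num_cycles` hard cosine restarts down to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # (num_cycles * progress) % 1.0 restarts the cosine at each cycle boundary
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0)))

peak = lr_at(250, 1000)  # warmup ends at step 250 -> full 2e-5
```

Each restart jumps the learning rate back to `base_lr` before decaying again, which is what `lr_scheduler_kwargs: {'num_cycles': ...}` configures in the training arguments further down.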
- README.md (+258 −117)
- pytorch_model.bin (+2 −2)

README.md CHANGED
@@ -87,34 +87,124 @@ model-index:
       type: sts-test
     metrics:
     - type: pearson_cosine
-      value: 0.
       name: Pearson Cosine
     - type: spearman_cosine
-      value: 0.
       name: Spearman Cosine
     - type: pearson_manhattan
-      value: 0.
       name: Pearson Manhattan
     - type: spearman_manhattan
-      value: 0.
       name: Spearman Manhattan
     - type: pearson_euclidean
-      value: 0.
       name: Pearson Euclidean
     - type: spearman_euclidean
-      value: 0.
       name: Spearman Euclidean
     - type: pearson_dot
-      value: 0.
       name: Pearson Dot
     - type: spearman_dot
-      value: 0.
       name: Spearman Dot
     - type: pearson_max
-      value: 0.
       name: Pearson Max
     - type: spearman_max
-      value: 0.
       name: Spearman Max
 ---
@@ -231,16 +321,67 @@ You can finetune this model on your own dataset.

 | Metric              | Value |
 |:--------------------|:-----------|
-| pearson_cosine      | 0.
-| **spearman_cosine** | **0.
-| pearson_manhattan   | 0.
-| spearman_manhattan  | 0.
-| pearson_euclidean   | 0.
-| spearman_euclidean  | 0.
-| pearson_dot         | 0.
-| spearman_dot        | 0.
-| pearson_max         | 0.
-| spearman_max        | 0.

 <!--
 ## Bias, Risks and Limitations
@@ -316,16 +457,16 @@ You can finetune this model on your own dataset.
 * Size: 3,194 training samples
 * Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
-  | | label | sentence1
-  |
-  | type | int | string
-  | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.
 * Samples:
-  | label | sentence1
-  |
-  | <code>1</code> | <code>
-  | <code>1</code> | <code>
-  | <code>1</code> | <code>
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
@@ -344,16 +485,16 @@ You can finetune this model on your own dataset.
 * Size: 4,000 training samples
 * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
 * Approximate statistics based on the first 1000 samples:
-  | | sentence1 | sentence2
-  |
-  | type | string | string
-  | details | <ul><li>min: 6 tokens</li><li>mean: 13.
 * Samples:
-  | sentence1
-  |
-  | <code>
-  | <code>
-  | <code>
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
@@ -375,13 +516,13 @@ You can finetune this model on your own dataset.
 | | sentence2 | sentence1 |
 |:--------|:------|:------|
 | type | string | string |
-| details | <ul><li>min: 7 tokens</li><li>mean: 16.
 * Samples:
-  | sentence2
-  |
-  | <code>
-  | <code>
-  | <code>A
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
@@ -400,16 +541,16 @@ You can finetune this model on your own dataset.
 * Size: 2,200 training samples
 * Columns: <code>sentence1</code> and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
-  | | sentence1
-  |
-  | type | string
-  | details | <ul><li>min: 7 tokens</li><li>mean: 23.
 * Samples:
-  | sentence1
-  |
-  | <code>
-  | <code>
-  | <code>
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
@@ -428,16 +569,16 @@ You can finetune this model on your own dataset.
 * Size: 2,500 training samples
 * Columns: <code>sentence1</code> and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
-  | | sentence1
-  |
-  | type | string
-  | details | <ul><li>min:
 * Samples:
-  | sentence1
-  |
-  | <code>
-  | <code>
-  | <code>
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
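The "Approximate statistics" tables in these hunks report min/mean/max token counts per column over the first 1000 samples. The same kind of statistic can be reproduced with a short sketch; this uses whitespace tokenization rather than the model's subword tokenizer, so the numbers will not match the card's:

```python
def token_stats(sentences, tokenize=str.split):
    """Return (min, mean, max) token counts for a list of sentences.

    `tokenize` defaults to whitespace splitting; a model card would use
    the model's own tokenizer instead.
    """
    counts = [len(tokenize(s)) for s in sentences]
    return min(counts), sum(counts) / len(counts), max(counts)

lo, mean, hi = token_stats(["a b", "a b c d"])
```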
@@ -796,9 +937,9 @@ You can finetune this model on your own dataset.
 - `per_device_eval_batch_size`: 18
 - `learning_rate`: 2e-05
 - `weight_decay`: 5e-07
-- `num_train_epochs`:
 - `lr_scheduler_type`: cosine_with_restarts
-- `lr_scheduler_kwargs`: {'num_cycles':
 - `warmup_ratio`: 0.25
 - `save_safetensors`: False
 - `fp16`: True
@@ -826,10 +967,10 @@ You can finetune this model on your own dataset.
 - `adam_beta2`: 0.999
 - `adam_epsilon`: 1e-08
 - `max_grad_norm`: 1.0
-- `num_train_epochs`:
 - `max_steps`: -1
 - `lr_scheduler_type`: cosine_with_restarts
-- `lr_scheduler_kwargs`: {'num_cycles':
 - `warmup_ratio`: 0.25
 - `warmup_steps`: 0
 - `log_level`: passive
@@ -922,57 +1063,57 @@ You can finetune this model on your own dataset.
 </details>

 ### Training Logs
-| Epoch | Step
-|
-| (removed training-log rows; epoch, step, and loss values truncated in the source)

 ### Framework Versions
       type: sts-test
     metrics:
     - type: pearson_cosine
+      value: 0.566653720937157
       name: Pearson Cosine
     - type: spearman_cosine
+      value: 0.5551442914704277
       name: Spearman Cosine
     - type: pearson_manhattan
+      value: 0.5771354814213894
       name: Pearson Manhattan
     - type: spearman_manhattan
+      value: 0.5723970841918167
       name: Spearman Manhattan
     - type: pearson_euclidean
+      value: 0.5619024776733639
       name: Pearson Euclidean
     - type: spearman_euclidean
+      value: 0.5593253322063549
       name: Spearman Euclidean
     - type: pearson_dot
+      value: 0.23527108587659004
       name: Pearson Dot
     - type: spearman_dot
+      value: 0.24219982461742934
       name: Spearman Dot
     - type: pearson_max
+      value: 0.5771354814213894
       name: Pearson Max
     - type: spearman_max
+      value: 0.5723970841918167
+      name: Spearman Max
+    - type: pearson_cosine
+      value: 0.566653720937157
+      name: Pearson Cosine
+    - type: spearman_cosine
+      value: 0.5551442914704277
+      name: Spearman Cosine
+    - type: pearson_manhattan
+      value: 0.5771354814213894
+      name: Pearson Manhattan
+    - type: spearman_manhattan
+      value: 0.5723970841918167
+      name: Spearman Manhattan
+    - type: pearson_euclidean
+      value: 0.5619024776733639
+      name: Pearson Euclidean
+    - type: spearman_euclidean
+      value: 0.5593253322063549
+      name: Spearman Euclidean
+    - type: pearson_dot
+      value: 0.23527108587659004
+      name: Pearson Dot
+    - type: spearman_dot
+      value: 0.24219982461742934
+      name: Spearman Dot
+    - type: pearson_max
+      value: 0.5771354814213894
+      name: Pearson Max
+    - type: spearman_max
+      value: 0.5723970841918167
+      name: Spearman Max
+    - type: pearson_cosine
+      value: 0.566653720937157
+      name: Pearson Cosine
+    - type: spearman_cosine
+      value: 0.5551442914704277
+      name: Spearman Cosine
+    - type: pearson_manhattan
+      value: 0.5771354814213894
+      name: Pearson Manhattan
+    - type: spearman_manhattan
+      value: 0.5723970841918167
+      name: Spearman Manhattan
+    - type: pearson_euclidean
+      value: 0.5619024776733639
+      name: Pearson Euclidean
+    - type: spearman_euclidean
+      value: 0.5593253322063549
+      name: Spearman Euclidean
+    - type: pearson_dot
+      value: 0.23527108587659004
+      name: Pearson Dot
+    - type: spearman_dot
+      value: 0.24219982461742934
+      name: Spearman Dot
+    - type: pearson_max
+      value: 0.5771354814213894
+      name: Pearson Max
+    - type: spearman_max
+      value: 0.5723970841918167
+      name: Spearman Max
+    - type: pearson_cosine
+      value: 0.566653720937157
+      name: Pearson Cosine
+    - type: spearman_cosine
+      value: 0.5551442914704277
+      name: Spearman Cosine
+    - type: pearson_manhattan
+      value: 0.5771354814213894
+      name: Pearson Manhattan
+    - type: spearman_manhattan
+      value: 0.5723970841918167
+      name: Spearman Manhattan
+    - type: pearson_euclidean
+      value: 0.5619024776733639
+      name: Pearson Euclidean
+    - type: spearman_euclidean
+      value: 0.5593253322063549
+      name: Spearman Euclidean
+    - type: pearson_dot
+      value: 0.23527108587659004
+      name: Pearson Dot
+    - type: spearman_dot
+      value: 0.24219982461742934
+      name: Spearman Dot
+    - type: pearson_max
+      value: 0.5771354814213894
+      name: Pearson Max
+    - type: spearman_max
+      value: 0.5723970841918167
       name: Spearman Max
 ---
 | Metric              | Value      |
 |:--------------------|:-----------|
+| pearson_cosine      | 0.5667     |
+| **spearman_cosine** | **0.5551** |
+| pearson_manhattan   | 0.5771     |
+| spearman_manhattan  | 0.5724     |
+| pearson_euclidean   | 0.5619     |
+| spearman_euclidean  | 0.5593     |
+| pearson_dot         | 0.2353     |
+| spearman_dot        | 0.2422     |
+| pearson_max         | 0.5771     |
+| spearman_max        | 0.5724     |
+
+#### Semantic Similarity
+* Dataset: `sts-test`
+* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+| Metric              | Value      |
+|:--------------------|:-----------|
+| pearson_cosine      | 0.5667     |
+| **spearman_cosine** | **0.5551** |
+| pearson_manhattan   | 0.5771     |
+| spearman_manhattan  | 0.5724     |
+| pearson_euclidean   | 0.5619     |
+| spearman_euclidean  | 0.5593     |
+| pearson_dot         | 0.2353     |
+| spearman_dot        | 0.2422     |
+| pearson_max         | 0.5771     |
+| spearman_max        | 0.5724     |
+
+#### Semantic Similarity
+* Dataset: `sts-test`
+* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+| Metric              | Value      |
+|:--------------------|:-----------|
+| pearson_cosine      | 0.5667     |
+| **spearman_cosine** | **0.5551** |
+| pearson_manhattan   | 0.5771     |
+| spearman_manhattan  | 0.5724     |
+| pearson_euclidean   | 0.5619     |
+| spearman_euclidean  | 0.5593     |
+| pearson_dot         | 0.2353     |
+| spearman_dot        | 0.2422     |
+| pearson_max         | 0.5771     |
+| spearman_max        | 0.5724     |
+
+#### Semantic Similarity
+* Dataset: `sts-test`
+* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
+
+| Metric              | Value      |
+|:--------------------|:-----------|
+| pearson_cosine      | 0.5667     |
+| **spearman_cosine** | **0.5551** |
+| pearson_manhattan   | 0.5771     |
+| spearman_manhattan  | 0.5724     |
+| pearson_euclidean   | 0.5619     |
+| spearman_euclidean  | 0.5593     |
+| pearson_dot         | 0.2353     |
+| spearman_dot        | 0.2422     |
+| pearson_max         | 0.5771     |
+| spearman_max        | 0.5724     |

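The pearson_* and spearman_* values reported by `EmbeddingSimilarityEvaluator` are plain correlation coefficients between the model's similarity scores and the gold STS labels. A self-contained sketch of both (the rank step here does not handle ties, unlike scipy's implementation):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson on the ranks (no tie correction)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))

# Hypothetical gold STS labels vs. model cosine similarities:
gold = [0.2, 0.9, 0.5, 0.7]
pred = [0.1, 0.8, 0.6, 0.65]
r_p, r_s = pearson(gold, pred), spearman(gold, pred)
```

The card's `pearson_cosine` / `spearman_cosine` come from cosine similarities, while the manhattan/euclidean/dot variants swap in those distance or similarity functions.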
 <!--
 ## Bias, Risks and Limitations
 * Size: 3,194 training samples
 * Columns: <code>label</code>, <code>sentence1</code>, and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
+  | | label | sentence1 | sentence2 |
+  |:--------|:------|:------|:------|
+  | type | int | string | string |
+  | details | <ul><li>1: 100.00%</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.8 tokens</li><li>max: 75 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 38.29 tokens</li><li>max: 512 tokens</li></ul> |
 * Samples:
+  | label | sentence1 | sentence2 |
+  |:------|:------|:------|
+  | <code>1</code> | <code>Kyle Kendricks was otherwise called the Professor .</code> | <code>`` Chicago Cubs ( �present ) } } Kyle Christian Hendricks ( born December 7 , 1989 ) , nicknamed `` '' The Proffessor , '' '' is an American professional baseball pitcher for the Chicago Cubs of Major League Baseball ( MLB ) . ''</code> |
+  | <code>1</code> | <code>Since 1982 , 533 people have been executed in Texas .</code> | <code>Since the death penalty was re-instituted in the United States with the 1976 Gregg v. Georgia decision , Texas has executed more inmates than any other state , beginning in 1982 with the execution of Charles Brooks , Jr.. Since 1982 , 533 people have been executed in Texas. 1923 , the Texas Department of Criminal Justice ( TDCJ ) has been in charge of executions in the state .</code> |
+  | <code>1</code> | <code>Hilltop Hoods have released two `` restrung '' albums .</code> | <code>`` The group released its first extended play , Back Once Again , in 1997 and have subsequently released seven studio albums , two `` '' restrung '' '' albums and three DVDs . ''</code> |
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
 * Size: 4,000 training samples
 * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
 * Approximate statistics based on the first 1000 samples:
+  | | sentence1 | sentence2 | label |
+  |:--------|:------|:------|:------|
+  | type | string | string | int |
+  | details | <ul><li>min: 6 tokens</li><li>mean: 13.79 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 35.8 tokens</li><li>max: 499 tokens</li></ul> | <ul><li>0: 100.00%</li></ul> |
 * Samples:
+  | sentence1 | sentence2 | label |
+  |:------|:------|:------|
+  | <code>Vinters have adopted solar technology to do what?</code> | <code>More recently the technology has been embraced by vinters, who use the energy generated by solar panels to power grape presses.</code> | <code>0</code> |
+  | <code>Who did Madonna's look and style of dressing influence?</code> | <code>It attracted the attention of organizations who complained that the song and its accompanying video promoted premarital sex and undermined family values, and moralists sought to have the song and video banned.</code> | <code>0</code> |
+  | <code>In addition to hearing him play, what else did people seek from Chopin in London?</code> | <code>The Prince, who was himself a talented musician, moved close to the keyboard to view Chopin's technique.</code> | <code>0</code> |
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
 | | sentence2 | sentence1 |
 |:--------|:------|:------|
 | type | string | string |
+| details | <ul><li>min: 7 tokens</li><li>mean: 16.0 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.71 tokens</li><li>max: 34 tokens</li></ul> |
 * Samples:
+  | sentence2 | sentence1 |
+  |:------|:------|
+  | <code>The fetal period lasts approximately 30 weeks weeks.</code> | <code>Approximately how many weeks does the fetal period last?</code> |
+  | <code>Corals build hard exoskeletons that grow to become coral reefs.</code> | <code>Corals build hard exoskeletons that grow to become what?</code> |
+  | <code>A voltaic cell generates an electric current through a reaction known as a(n) spontaneous redox.</code> | <code>A voltaic cell uses what type of reaction to generate an electric current</code> |
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
 * Size: 2,200 training samples
 * Columns: <code>sentence1</code> and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
+  | | sentence1 | sentence2 |
+  |:--------|:------|:------|
+  | type | string | string |
+  | details | <ul><li>min: 7 tokens</li><li>mean: 23.76 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.27 tokens</li><li>max: 41 tokens</li></ul> |
 * Samples:
+  | sentence1 | sentence2 |
+  |:------|:------|
+  | <code>As the water vapor cools, it condenses , forming tiny droplets in clouds.</code> | <code>Clouds are formed from water droplets.</code> |
+  | <code>Poison ivy is green, with three leaflets on each leaf, grows as a shrub or vine, and may be in your yard.</code> | <code>Poison ivy typically has three groups of leaves.</code> |
+  | <code>(Formic acid is the poison found in the > sting of fire ants.)</code> | <code>Formic acid is found in the secretions of stinging ants.</code> |
 * Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
   ```json
   {
 * Size: 2,500 training samples
 * Columns: <code>sentence1</code> and <code>sentence2</code>
 * Approximate statistics based on the first 1000 samples:
+  | | sentence1 | sentence2 |
+  |:--------|:------|:------|
+  | type | string | string |
+  | details | <ul><li>min: 14 tokens</li><li>mean: 345.33 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 27.11 tokens</li><li>max: 60 tokens</li></ul> |
 * Samples:
+  | sentence1 | sentence2 |
+  |:------|:------|
| <code>Rahim Kalantar told the BBC his son Ali, 18, travelled to Syria with two friends from Coventry in March and believed he was now fighting with Isis.<br>He said he was sent "down this road" by an imam - who denied the allegations.<br>Up to 500 Britons are thought to have travelled to the Middle East to fight in the conflict, officials say.<br>Mr Kalantar - speaking to BBC Two's Newsnight, in collaboration with the BBC's Afghan Service and Newsday - said he worries about his son Ali "every minute" and that his grief is "limitless".<br>He said he believed Ali - who was planning to study computer science at university - had been radicalised during classes at a mosque after evening prayer.<br>"He [the imam] encouraged them and sent them down this road," he said.<br>The BBC contacted the mosque to speak to the imam, who refused to give an interview but said he completely denied the allegations.<br>Ali is believed to have travelled to Syria with Rashed Amani, also 18, who had been studying business at Coventry University.<br>Rashed's father, Khabir, said family members had travelled to the Turkish-Syrian border in the hope of finding the boys, but came back "empty-handed" after searching for more than two weeks.<br>He said he did not know what had happened to his son, who he fears has joined Isis - the militant-led group that has made rapid advances through Iraq in recent weeks.<br>"Maybe somebody worked with him, I don't know. Maybe somebody brainwashed him because he was not like that," he said.<br>The third teenager, Moh Ismael, is also believed to be in Syria with his friends. He is understood to have posted a message on Twitter saying he was with Isis.<br>It comes after Britons - including Reyaad Khan and Nasser Muthana from Cardiff - featured in an apparent recruitment video for jihadists in Iraq and Syria.<br>The video was posted online on Friday by accounts with links to Isis.<br>The BBC has learned a third Briton in the video is from Aberdeen. The man, named locally as Raqib, grew up in Scotland but was originally from Bangladesh.<br>Lord Carlile, a former independent reviewer of terrorism laws, told the BBC that the Muslim community was best placed to stop jihadists recruiting in the UK.<br>The Liberal Democrat peer also said the UK needed to reintroduce tougher measures to stop terrorism.<br>It comes after former MI6 director, Richard Barrett, said security services would not be able to track all Britons who return to the UK after fighting in Syria.<br>He said the number of those posing a threat would be small but unpredictable.<br>The Metropolitan Police has insisted it has the tools to monitor British jihadists returning from that country.<br>Shiraz Maher, a radicalisation expert, told Newsnight that social media was now acting as a recruitment ground for potential jihadists in the UK.<br>"You have hundreds of foreign fighters on the ground who in real time are giving you a live feed of what is happening and they are engaged in a conversation.<br>"It is these individual people who have been empowered to become recruiters in their own right," he said.<br>Lord Carlile said the "most important partners" in preventing young Muslims from being radicalised were the "Muslim communities themselves".<br>"Mothers, wives, sisters do not want their husbands, brothers, sons to become valid jihadists and run the risk of being killed in a civil war," he told the programme.<br>He also told BBC Radio 4's World at One programme that the government should look at reintroducing "something like control orders", which were scrapped in 2011 and replaced with the less restrictive Terrorism Prevention and Investigation Measures (TPims).<br>He said: "We need to look at preventing violent extremism before people leave the country and also we need to look for further measures."</code> | <code>The father of a British teenager who travelled to Syria to join jihadists believes his son was radicalised by an imam at a UK mosque.</code> |
| <code>Jawad Fairooz and Matar Matar were detained in May after resigning from parliament in protest at the handling of the protests.<br>Mr Matar told the BBC they had been tortured in prison.<br>They were prosecuted in a security court on charges of taking part in illegal protests and defaming the country.<br>It is not clear if they still face trial in a civilian court.<br>Civilian courts took over jurisdiction after King Hamad Bin Issa Al Khalifa lifted a state of emergency in June.<br>Mr Matar told the BBC he believed his arrest had been intended to put a pressure on his al-Wifaq party.<br>"At some stages we were tortured," he said. "In one of the cases we were beaten."<br>Human rights lawyer Mohamed al-Tajir was also released.<br>He was detained in April having defended people arrested during the Saudi-backed suppression of protests in March.<br>Correspondents say their release appears to be an attempt at defusing tensions in the country, a key US ally in the region that hosts the US Navy's 5th Fleet.<br>Bahrain's King Hamad Bin Issa Al Khalifa recently accepted a series of reforms drawn up by a government-backed committee created to address grievances that emerged during the protests.<br>The kingdom's Shia community makes up about 70% of the population but many say they are discriminated against by the minority Sunni monarchy.</code> | <code>Bahrain has freed two former Shia opposition MPs arrested in the wake of widespread anti-government protests.</code> |
| <code>Liverpool City Region, in case you were wondering, includes Merseyside's five councils (Knowsley, Liverpool, Sefton, St Helens, and Wirral) as well as Halton in Cheshire.<br>Who are the eight candidates desperate for your support on 4 May, though, and what are their priorities?<br>BBC Radio Merseyside's political reporter Claire Hamilton has produced a potted biography for each of them.<br>We're also asking all of them for a "minute manifesto" video.<br>Candidates are listed below in alphabetical order<br>Roger Bannister, Trade Union & Socialist Coalition<br>Veteran trade unionist Roger Bannister believes the Liverpool City Region Combined Authority should never have approved the contract for a fleet of new driver-only Merseyrail trains. He says he would seek to reverse this decision. He also believes local authorities have passed harmful austerity budgets on people struggling to make ends meet. He stood for Liverpool city mayor in 2016, coming fourth with 5% of the vote.<br>Paul Breen, Get the Coppers off the Jury<br>Paul Breen is a resident of Norris Green, Liverpool and became the last candidate to be nominated. He is listed as treasurer of the party on the Electoral Commission's website, with Patricia Breen listed as deputy treasurer. He has not yet released any material detailing his manifesto but told the BBC the title of his campaign speaks for itself. He simply does not believe that police officers should be allowed to serve on juries.<br>Mr Breen declined to provide a "minute manifesto"<br>Tony Caldeira, Conservative<br>Born in Liverpool and educated in St Helens, Tony Caldeira started out working on a stall selling cushions made by his mother at Liverpool's Great Homer Street market. His business expanded and now operates in Kirkby, distributing world-wide. Mr Caldeira has stood for Liverpool mayor twice, coming sixth in 2016 with just under 4% of the vote. He has pledged to improve the area's transport network, speed up the planning process and build homes and workplaces on brownfield sites rather than green spaces.<br>Carl Cashman, Liberal Democrats<br>Born in Whiston, Knowsley, Carl Cashman is leader of the Liberal Democrat group on Knowsley Council. He and his two Lib Dem council colleagues were elected in 2016, breaking a four-year period when Labour was the only party represented. Aged 25, he's the youngest of the candidates. Mr Cashman believes maintaining strong ties with Europe and the region will be key, and has pledged to open a Liverpool City Region embassy in Brussels. He also wants to better integrate ticketing across public transport and make the current Walrus card more similar to the Oyster card used by Londoners.<br>Tom Crone, Green Party<br>Tom Crone is leader of the Green group on Liverpool City Council. He won 10% of the vote in the mayoral elections in Liverpool in 2016 and came third. Originally from Norwich, he has lived in Liverpool since 2000 after arriving as a student. Mr Crone is keen to see a shift away from traditional heavy industry in the city region towards greener "tech" industries. He's also passionate about making public transport more affordable and environmentally friendly. He says he'll look to prioritise new routes for cyclists and pedestrians.<br>Tabitha Morton, Women's Equality Party<br>Tabitha Morton was born in Netherton, Sefton. She left school with no formal qualifications, and started work at 16 at a local market, and later in cleaning. She was taken on for NVQ training by a company in Liverpool, and stayed on to train others. She now works for a global manufacturer, in what she describes as "a male-dominated industry". She says she would prioritise grants for employers offering equal apprenticeships for young women and men and ring-fence funds for training women in sectors in which they're underrepresented.<br>Steve Rotheram, Labour<br>Born in Kirkby, former bricklayer Steve Rotheram was a city councillor in Liverpool and also Lord Mayor during the city's European Capital of Culture year in 2008. He was also elected MP for Liverpool Walton in 2010, and re-elected to the seat in 2015. Mr Rotheram is pledging to cut the cost of the fast tag for motorists driving through the Mersey tunnels. He wants to improve education and offer better careers advice for young people, and also wants to make brownfield sites more attractive to developers.<br>Paula Walters, UKIP<br>Wallasey-born Paula Walters is chairman of UKIP in Wirral and lives in New Brighton with her family. She has campaigned to scrap tunnel tolls for several years. She says her local UKIP branch is one of the most thriving in the North West. A civil servant, she studied English and biomolecular science at degree-level. She has also lived in South Africa where she attended the University of Pretoria. She believes Liverpool city centre has attracted money at the expense of outlying areas, one of the things she wants to tackle.</code> | <code>Those hoping to become the first mayor of the Liverpool City Region have less than a month remaining in which to secure your vote.</code> |
* Loss: [<code>AdaptiveLayerLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.15,
      "kl_div_weight": 2,
      "kl_temperature": 2
  }
  ```
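As a rough illustration of how these weights interact, the sketch below reproduces the core weighting of `AdaptiveLayerLoss` in plain Python. It is a simplification, not the library's exact implementation: the KL-divergence term (`kl_div_weight`, `kl_temperature`) and per-step layer sampling are omitted, and the function name is illustrative.

```python
def adaptive_layer_loss(last_layer_loss, prior_layer_losses,
                        last_layer_weight=1.5, prior_layers_weight=0.15):
    """Simplified AdaptiveLayerLoss combination (KL-divergence term omitted).

    last_layer_loss: scalar loss computed on the final layer's embeddings.
    prior_layer_losses: losses for the earlier layers trained this step
    (with n_layers_per_step = -1, that is all prior layers).
    """
    if prior_layer_losses:
        prior_mean = sum(prior_layer_losses) / len(prior_layer_losses)
    else:
        prior_mean = 0.0
    # Final layer dominates; earlier layers add a smaller regularising signal.
    return last_layer_weight * last_layer_loss + prior_layers_weight * prior_mean
```

Weighting the earlier layers like this is what allows the trained model to be truncated to fewer layers at inference time with a modest quality loss.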
- `per_device_eval_batch_size`: 18
- `learning_rate`: 2e-05
- `weight_decay`: 5e-07
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 5}
- `warmup_ratio`: 0.25
- `save_safetensors`: False
- `fp16`: True

- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 5}
- `warmup_ratio`: 0.25
- `warmup_steps`: 0
- `log_level`: passive
</details>
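The scheduler settings above (linear warmup over the first 25% of steps, then `cosine_with_restarts` with 5 cycles decaying from 2e-05) can be sketched as a step-to-learning-rate function. This mirrors the shape of the cosine-with-hard-restarts schedule in `transformers`; the total step count passed in is an illustrative input, not a value from this run.

```python
import math

def lr_at_step(step, total_steps, base_lr=2e-05, warmup_ratio=0.25, num_cycles=5):
    # Linear warmup over the first warmup_ratio fraction of training...
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # ...then num_cycles hard-restarted cosine decays down to zero.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0)))
```

Each restart snaps the rate back to the base value: with 1000 total steps, the rate is 0 at step 0, reaches 2e-05 at step 250 (warmup end), and returns to 2e-05 at step 400 when the second cosine cycle begins.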
### Training Logs

| Epoch  | Step  | Training Loss | scitail-pairs-pos loss | qnli-contrastive loss | nli-pairs loss | sts-test_spearman_cosine |
|:------:|:-----:|:-------------:|:----------------------:|:---------------------:|:--------------:|:------------------------:|
| 0.1003 | 281   | 8.4339        | -                      | -                     | -              | -                        |
| 0.2006 | 562   | 6.8644        | -                      | -                     | -              | -                        |
| 0.3009 | 843   | 5.1225        | -                      | -                     | -              | -                        |
| 0.4001 | 1121  | -             | 2.4070                 | 4.2827                | 3.6032         | -                        |
| 0.4011 | 1124  | 3.9997        | -                      | -                     | -              | -                        |
| 0.5014 | 1405  | 3.6186        | -                      | -                     | -              | -                        |
| 0.6017 | 1686  | 3.259         | -                      | -                     | -              | -                        |
| 0.7020 | 1967  | 3.1712        | -                      | -                     | -              | -                        |
| 0.8001 | 2242  | -             | 1.6090                 | 2.5195                | 2.2851         | -                        |
| 0.8023 | 2248  | 3.104         | -                      | -                     | -              | -                        |
| 0.9026 | 2529  | 2.8549        | -                      | -                     | -              | -                        |
| 1.0029 | 2810  | 2.8668        | -                      | -                     | -              | -                        |
| 1.1031 | 3091  | 2.7466        | -                      | -                     | -              | -                        |
| 1.2002 | 3363  | -             | 1.3474                 | 2.2222                | 1.8491         | -                        |
| 1.2034 | 3372  | 2.6502        | -                      | -                     | -              | -                        |
| 1.3037 | 3653  | 2.2191        | -                      | -                     | -              | -                        |
| 1.4040 | 3934  | 2.2311        | -                      | -                     | -              | -                        |
| 1.5043 | 4215  | 2.22          | -                      | -                     | -              | -                        |
| 1.6003 | 4484  | -             | 1.2671                 | 1.7964                | 1.6444         | -                        |
| 1.6046 | 4496  | 2.1372        | -                      | -                     | -              | -                        |
| 1.7049 | 4777  | 2.2219        | -                      | -                     | -              | -                        |
| 1.8051 | 5058  | 2.2618        | -                      | -                     | -              | -                        |
| 1.9054 | 5339  | 1.9995        | -                      | -                     | -              | -                        |
| 2.0004 | 5605  | -             | 1.2434                 | 1.8182                | 1.5385         | -                        |
| 2.0057 | 5620  | 1.9757        | -                      | -                     | -              | -                        |
| 2.1060 | 5901  | 2.0401        | -                      | -                     | -              | -                        |
| 2.2063 | 6182  | 1.9818        | -                      | -                     | -              | -                        |
| 2.3066 | 6463  | 1.7816        | -                      | -                     | -              | -                        |
| 2.4004 | 6726  | -             | 1.0396                 | 1.5587                | 1.5077         | -                        |
| 2.4069 | 6744  | 1.9239        | -                      | -                     | -              | -                        |
| 2.5071 | 7025  | 2.0148        | -                      | -                     | -              | -                        |
| 2.6074 | 7306  | 1.9629        | -                      | -                     | -              | -                        |
| 2.7077 | 7587  | 1.7316        | -                      | -                     | -              | -                        |
| 2.8005 | 7847  | -             | 1.0507                 | 1.3294                | 1.4039         | -                        |
| 2.8080 | 7868  | 1.7794        | -                      | -                     | -              | -                        |
| 2.9083 | 8149  | 1.7029        | -                      | -                     | -              | -                        |
| 3.0086 | 8430  | 1.7996        | -                      | -                     | -              | -                        |
| 3.1089 | 8711  | 1.9379        | -                      | -                     | -              | -                        |
| 3.2006 | 8968  | -             | 0.9949                 | 1.3678                | 1.3436         | -                        |
| 3.2091 | 8992  | 1.844         | -                      | -                     | -              | -                        |
| 3.3094 | 9273  | 1.358         | -                      | -                     | -              | -                        |
| 3.4097 | 9554  | 1.5104        | -                      | -                     | -              | -                        |
| 3.5100 | 9835  | 1.6964        | -                      | -                     | -              | -                        |
| 3.6006 | 10089 | -             | 0.9538                 | 1.1866                | 1.3098         | -                        |
| 3.6103 | 10116 | 1.7661        | -                      | -                     | -              | -                        |
| 3.7106 | 10397 | 1.6529        | -                      | -                     | -              | -                        |
| 3.8108 | 10678 | 1.6835        | -                      | -                     | -              | -                        |
| 3.9111 | 10959 | 1.35          | -                      | -                     | -              | -                        |
| 4.0    | 11208 | -             | -                      | -                     | -              | 0.5551                   |

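The `sts-test_spearman_cosine` column reports the Spearman rank correlation between the cosine similarities of embedded sentence pairs and the gold STS similarity scores. A minimal, tie-free sketch of that metric is below; the helper names are illustrative, and real evaluations typically use `scipy.stats.spearmanr` instead.

```python
import math

def cosine_similarity(u, v):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def spearman(x, y):
    # Pearson correlation of the ranks; valid when there are no ties.
    def ranks(vals):
        order = sorted(range(len(vals)), key=vals.__getitem__)
        r = [0] * len(vals)
        for rank, idx in enumerate(order):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    mean = (len(x) - 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)  # equal for rx and ry without ties
    return cov / var
```

Because it compares ranks rather than raw values, the score rewards getting the ordering of pair similarities right, regardless of the absolute similarity scale.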
### Framework Versions
pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:eb7927d6446f814065535e12fc14d9042e73ca86dc4ebdd235b5668414cc9613
+size 451824288
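`pytorch_model.bin` is stored via Git LFS, so the repository tracks only a small pointer file (spec version, object hash, byte size) rather than the weights themselves. As a hedged sketch, a pointer like the updated one above can be parsed with a few lines of Python; the helper name is illustrative, not part of any Git LFS tooling.

```python
def parse_lfs_pointer(text):
    # Each pointer line is "<key> <value>"; size is the blob's byte count.
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:eb7927d6446f814065535e12fc14d9042e73ca86dc4ebdd235b5668414cc9613
size 451824288
"""
```

Here `parse_lfs_pointer(pointer)["size"]` gives 451824288 bytes, i.e. roughly 452 MB for the updated weights file.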