richie-ghost committed
Commit
bf92c62
1 Parent(s): c993e88

Add SetFit model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 1024,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
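This pooling configuration selects masked mean pooling over the 1024-dimensional RoBERTa token embeddings. A minimal sketch of what that operation computes (illustrative only; it loads the base model rather than this repository):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Mean pooling, as selected by "pooling_mode_mean_tokens": true above:
# average the token embeddings while ignoring padding positions.
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/roberta-large")
encoder = AutoModel.from_pretrained("FacebookAI/roberta-large")

batch = tokenizer(["Take it easy!"], return_tensors="pt", padding=True)
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state   # (batch, seq_len, 1024)

mask = batch["attention_mask"].unsqueeze(-1).float()         # (batch, seq_len, 1)
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)                               # torch.Size([1, 1024])
```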
README.md ADDED
@@ -0,0 +1,527 @@
+ ---
+ library_name: setfit
+ tags:
+ - setfit
+ - sentence-transformers
+ - text-classification
+ - generated_from_setfit_trainer
+ base_model: FacebookAI/roberta-large
+ metrics:
+ - accuracy
+ widget:
+ - text: Just checking in, how have you been feeling since our last chat?
+ - text: I’m looking forward to learning more from you.
+ - text: Take it easy!
+ - text: It was great seeing you. Let's catch up again soon!
+ - text: Let’s make sure you’re not carrying too much; how are you?
+ pipeline_tag: text-classification
+ inference: true
+ model-index:
+ - name: SetFit with FacebookAI/roberta-large
+   results:
+   - task:
+       type: text-classification
+       name: Text Classification
+     dataset:
+       name: Unknown
+       type: unknown
+       split: test
+     metrics:
+     - type: accuracy
+       value: 0.96
+       name: Accuracy
+ ---
+
+ # SetFit with FacebookAI/roberta-large
+
+ This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
+
+ The model has been trained using an efficient few-shot learning technique that involves:
+
+ 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
+ 2. Training a classification head with features from the fine-tuned Sentence Transformer.
+
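+ A minimal sketch of these two phases with the `setfit` trainer (the tiny dataset below is a placeholder, not the data this model was trained on):
+
+ ```python
+ from datasets import Dataset
+ from setfit import SetFitModel, Trainer, TrainingArguments
+
+ # Placeholder few-shot data; swap in your own labelled examples.
+ train_dataset = Dataset.from_dict({
+     "text": ["See you soon!", "What are the core components of your business model?"],
+     "label": ["true", "false"],
+ })
+
+ model = SetFitModel.from_pretrained("FacebookAI/roberta-large", labels=["false", "true"])
+ args = TrainingArguments(batch_size=16, num_epochs=1)
+
+ trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
+ trainer.train()  # phase 1: contrastive fine-tuning of the body; phase 2: fitting the LogisticRegression head
+ ```
+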
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** SetFit
+ - **Sentence Transformer body:** [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large)
+ - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
+ - **Maximum Sequence Length:** 512 tokens
+ - **Number of Classes:** 2 classes
+ <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
+ - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
+ - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
+
+ ### Model Labels
+ | Label | Examples |
+ |:------|:---------|
+ | true | <ul><li>'See you soon!'</li><li>'You look well!'</li><li>'Your journey is quite inspiring, can you share more about it?'</li></ul> |
+ | false | <ul><li>'What are the core components of your business model?'</li><li>'How do you balance your personal and professional life?'</li><li>"There is a situation where a daughter of a narcissistic mother denigrated the father. When the mother complained to the daughter about the father and how poor he was a a husband and person and how badly he treated the wife. The mother's claims were inaccurate and overblown. The mother said I inappropriate things to the daughter such as he flirted with other women, or the mother could have done much better than marrying him. After such episodes, the daughter was dismissive and rude to the father. What are the signs of parental alienation and what are the impacts on a daughter growing up and as an adult?"</li></ul> |
+
+ ## Evaluation
+
+ ### Metrics
+ | Label   | Accuracy |
+ |:--------|:---------|
+ | **all** | 0.96     |
+
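+ As a rough sketch, a score like this can be reproduced with the `setfit` trainer's built-in accuracy metric on a labelled test split (the two examples below are placeholders, not the actual test set):
+
+ ```python
+ from datasets import Dataset
+ from setfit import SetFitModel, Trainer
+
+ model = SetFitModel.from_pretrained("richie-ghost/setfit-FacebookAI-roberta-large-phatic")
+
+ # Placeholder test data with "text" and "label" columns.
+ test_dataset = Dataset.from_dict({
+     "text": ["Take it easy!", "How do you balance your personal and professional life?"],
+     "label": ["true", "false"],
+ })
+
+ trainer = Trainer(model=model, eval_dataset=test_dataset, metric="accuracy")
+ print(trainer.evaluate())  # {'accuracy': ...}
+ ```
+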
+ ## Uses
+
+ ### Direct Use for Inference
+
+ First install the SetFit library:
+
+ ```bash
+ pip install setfit
+ ```
+
+ Then you can load this model and run inference.
+
+ ```python
+ from setfit import SetFitModel
+
+ # Download from the 🤗 Hub
+ model = SetFitModel.from_pretrained("richie-ghost/setfit-FacebookAI-roberta-large-phatic")
+ # Run inference
+ preds = model("Take it easy!")
+ ```
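+
+ The model also accepts a batch of texts, and, since the head is a scikit-learn LogisticRegression, class probabilities are available as well; a short sketch:
+
+ ```python
+ texts = ["Take it easy!", "What are the core components of your business model?"]
+ print(model.predict(texts))        # e.g. ['true', 'false']
+ print(model.predict_proba(texts))  # per-class probabilities from the LogisticRegression head
+ ```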
+
+ <!--
+ ### Downstream Use
+
+ *List how someone could finetune this model on their own dataset.*
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Set Metrics
+ | Training set | Min | Median | Max |
+ |:-------------|:----|:-------|:----|
+ | Word count   | 1   | 9.8722 | 108 |
+
+ | Label | Training Sample Count |
+ |:------|:----------------------|
+ | false | 191                   |
+ | true  | 169                   |
+
+ ### Training Hyperparameters
+ - batch_size: (16, 16)
+ - num_epochs: (4, 4)
+ - max_steps: -1
+ - sampling_strategy: oversampling
+ - body_learning_rate: (2e-05, 1e-05)
+ - head_learning_rate: 0.01
+ - loss: CosineSimilarityLoss
+ - distance_metric: cosine_distance
+ - margin: 0.25
+ - end_to_end: False
+ - use_amp: False
+ - warmup_proportion: 0.1
+ - seed: 42
+ - eval_max_steps: -1
+ - load_best_model_at_end: True
+
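+ These fields mirror `setfit.TrainingArguments`; a sketch of the equivalent configuration (values copied from the list above):
+
+ ```python
+ from sentence_transformers.losses import CosineSimilarityLoss
+ from setfit import TrainingArguments
+
+ args = TrainingArguments(
+     batch_size=(16, 16),              # (embedding fine-tuning, classifier head)
+     num_epochs=(4, 4),
+     max_steps=-1,
+     sampling_strategy="oversampling",
+     body_learning_rate=(2e-05, 1e-05),
+     head_learning_rate=0.01,
+     loss=CosineSimilarityLoss,
+     margin=0.25,                      # only used by margin-based losses
+     end_to_end=False,
+     use_amp=False,
+     warmup_proportion=0.1,
+     seed=42,
+     eval_max_steps=-1,
+     load_best_model_at_end=True,
+ )
+ ```
+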
+ ### Training Results
+ | Epoch | Step | Training Loss | Validation Loss |
+ |:-------:|:--------:|:-------------:|:---------------:|
+ | 0.0002 | 1 | 0.4745 | - |
+ | 0.0122 | 50 | 0.441 | - |
+ | 0.0245 | 100 | 0.4422 | - |
+ | 0.0367 | 150 | 0.2339 | - |
+ | 0.0489 | 200 | 0.1182 | - |
+ | 0.0612 | 250 | 0.0806 | - |
+ | 0.0734 | 300 | 0.1183 | - |
+ | 0.0856 | 350 | 0.0551 | - |
+ | 0.0978 | 400 | 0.0146 | - |
+ | 0.1101 | 450 | 0.0115 | - |
+ | 0.1223 | 500 | 0.0042 | - |
+ | 0.1345 | 550 | 0.0053 | - |
+ | 0.1468 | 600 | 0.0021 | - |
+ | 0.1590 | 650 | 0.0596 | - |
+ | 0.1712 | 700 | 0.0029 | - |
+ | 0.1835 | 750 | 0.0009 | - |
+ | 0.1957 | 800 | 0.0002 | - |
+ | 0.2079 | 850 | 0.0005 | - |
+ | 0.2202 | 900 | 0.0013 | - |
+ | 0.2324 | 950 | 0.0008 | - |
+ | 0.2446 | 1000 | 0.0004 | - |
+ | 0.2568 | 1050 | 0.0004 | - |
+ | 0.2691 | 1100 | 0.0004 | - |
+ | 0.2813 | 1150 | 0.0003 | - |
+ | 0.2935 | 1200 | 0.0003 | - |
+ | 0.3058 | 1250 | 0.0012 | - |
+ | 0.3180 | 1300 | 0.0001 | - |
+ | 0.3302 | 1350 | 0.0002 | - |
+ | 0.3425 | 1400 | 0.0003 | - |
+ | 0.3547 | 1450 | 0.0024 | - |
+ | 0.3669 | 1500 | 0.0008 | - |
+ | 0.3792 | 1550 | 0.0015 | - |
+ | 0.3914 | 1600 | 0.0002 | - |
+ | 0.4036 | 1650 | 0.0002 | - |
+ | 0.4159 | 1700 | 0.1842 | - |
+ | 0.4281 | 1750 | 0.0009 | - |
+ | 0.4403 | 1800 | 0.0001 | - |
+ | 0.4525 | 1850 | 0.0013 | - |
+ | 0.4648 | 1900 | 0.0637 | - |
+ | 0.4770 | 1950 | 0.0002 | - |
+ | 0.4892 | 2000 | 0.0007 | - |
+ | 0.5015 | 2050 | 0.0001 | - |
+ | 0.5137 | 2100 | 0.0 | - |
+ | 0.5259 | 2150 | 0.0 | - |
+ | 0.5382 | 2200 | 0.0 | - |
+ | 0.5504 | 2250 | 0.0 | - |
+ | 0.5626 | 2300 | 0.0001 | - |
+ | 0.5749 | 2350 | 0.0 | - |
+ | 0.5871 | 2400 | 0.0 | - |
+ | 0.5993 | 2450 | 0.0 | - |
+ | 0.6115 | 2500 | 0.0 | - |
+ | 0.6238 | 2550 | 0.0 | - |
+ | 0.6360 | 2600 | 0.0 | - |
+ | 0.6482 | 2650 | 0.0 | - |
+ | 0.6605 | 2700 | 0.0001 | - |
+ | 0.6727 | 2750 | 0.0 | - |
+ | 0.6849 | 2800 | 0.0 | - |
+ | 0.6972 | 2850 | 0.0 | - |
+ | 0.7094 | 2900 | 0.0 | - |
+ | 0.7216 | 2950 | 0.0 | - |
+ | 0.7339 | 3000 | 0.0 | - |
+ | 0.7461 | 3050 | 0.0 | - |
+ | 0.7583 | 3100 | 0.0001 | - |
+ | 0.7705 | 3150 | 0.0 | - |
+ | 0.7828 | 3200 | 0.0 | - |
+ | 0.7950 | 3250 | 0.0 | - |
+ | 0.8072 | 3300 | 0.0 | - |
+ | 0.8195 | 3350 | 0.0 | - |
+ | 0.8317 | 3400 | 0.0 | - |
+ | 0.8439 | 3450 | 0.0001 | - |
+ | 0.8562 | 3500 | 0.0 | - |
+ | 0.8684 | 3550 | 0.0 | - |
+ | 0.8806 | 3600 | 0.0 | - |
+ | 0.8929 | 3650 | 0.0 | - |
+ | 0.9051 | 3700 | 0.0 | - |
+ | 0.9173 | 3750 | 0.0 | - |
+ | 0.9295 | 3800 | 0.0 | - |
+ | 0.9418 | 3850 | 0.0 | - |
+ | 0.9540 | 3900 | 0.0 | - |
+ | 0.9662 | 3950 | 0.0 | - |
+ | 0.9785 | 4000 | 0.0 | - |
+ | 0.9907 | 4050 | 0.0 | - |
+ | **1.0** | **4088** | **-** | **0.0815** |
+ | 1.0029 | 4100 | 0.0 | - |
+ | 1.0152 | 4150 | 0.0 | - |
+ | 1.0274 | 4200 | 0.0 | - |
+ | 1.0396 | 4250 | 0.0 | - |
+ | 1.0519 | 4300 | 0.0 | - |
+ | 1.0641 | 4350 | 0.0 | - |
+ | 1.0763 | 4400 | 0.0 | - |
+ | 1.0886 | 4450 | 0.0 | - |
+ | 1.1008 | 4500 | 0.0 | - |
+ | 1.1130 | 4550 | 0.0 | - |
+ | 1.1252 | 4600 | 0.0 | - |
+ | 1.1375 | 4650 | 0.0 | - |
+ | 1.1497 | 4700 | 0.0 | - |
+ | 1.1619 | 4750 | 0.0 | - |
+ | 1.1742 | 4800 | 0.0 | - |
+ | 1.1864 | 4850 | 0.0 | - |
+ | 1.1986 | 4900 | 0.0 | - |
+ | 1.2109 | 4950 | 0.0 | - |
+ | 1.2231 | 5000 | 0.0 | - |
+ | 1.2353 | 5050 | 0.0 | - |
+ | 1.2476 | 5100 | 0.0 | - |
+ | 1.2598 | 5150 | 0.0 | - |
+ | 1.2720 | 5200 | 0.0 | - |
+ | 1.2842 | 5250 | 0.0 | - |
+ | 1.2965 | 5300 | 0.0 | - |
+ | 1.3087 | 5350 | 0.0 | - |
+ | 1.3209 | 5400 | 0.0 | - |
+ | 1.3332 | 5450 | 0.0 | - |
+ | 1.3454 | 5500 | 0.0 | - |
+ | 1.3576 | 5550 | 0.0 | - |
+ | 1.3699 | 5600 | 0.0 | - |
+ | 1.3821 | 5650 | 0.0 | - |
+ | 1.3943 | 5700 | 0.0 | - |
+ | 1.4066 | 5750 | 0.0 | - |
+ | 1.4188 | 5800 | 0.0 | - |
+ | 1.4310 | 5850 | 0.0 | - |
+ | 1.4432 | 5900 | 0.0 | - |
+ | 1.4555 | 5950 | 0.0 | - |
+ | 1.4677 | 6000 | 0.0 | - |
+ | 1.4799 | 6050 | 0.0 | - |
+ | 1.4922 | 6100 | 0.0 | - |
+ | 1.5044 | 6150 | 0.0112 | - |
+ | 1.5166 | 6200 | 0.4712 | - |
+ | 1.5289 | 6250 | 0.3977 | - |
+ | 1.5411 | 6300 | 0.2112 | - |
+ | 1.5533 | 6350 | 0.318 | - |
+ | 1.5656 | 6400 | 0.2523 | - |
+ | 1.5778 | 6450 | 0.2829 | - |
+ | 1.5900 | 6500 | 0.2736 | - |
+ | 1.6023 | 6550 | 0.2493 | - |
+ | 1.6145 | 6600 | 0.3112 | - |
+ | 1.6267 | 6650 | 0.2291 | - |
+ | 1.6389 | 6700 | 0.2855 | - |
+ | 1.6512 | 6750 | 0.2642 | - |
+ | 1.6634 | 6800 | 0.2376 | - |
+ | 1.6756 | 6850 | 0.2983 | - |
+ | 1.6879 | 6900 | 0.2853 | - |
+ | 1.7001 | 6950 | 0.3095 | - |
+ | 1.7123 | 7000 | 0.2497 | - |
+ | 1.7246 | 7050 | 0.2305 | - |
+ | 1.7368 | 7100 | 0.2433 | - |
+ | 1.7490 | 7150 | 0.2505 | - |
+ | 1.7613 | 7200 | 0.2292 | - |
+ | 1.7735 | 7250 | 0.3028 | - |
+ | 1.7857 | 7300 | 0.2394 | - |
+ | 1.7979 | 7350 | 0.2601 | - |
+ | 1.8102 | 7400 | 0.2417 | - |
+ | 1.8224 | 7450 | 0.2086 | - |
+ | 1.8346 | 7500 | 0.2573 | - |
+ | 1.8469 | 7550 | 0.2344 | - |
+ | 1.8591 | 7600 | 0.2381 | - |
+ | 1.8713 | 7650 | 0.2772 | - |
+ | 1.8836 | 7700 | 0.2614 | - |
+ | 1.8958 | 7750 | 0.2659 | - |
+ | 1.9080 | 7800 | 0.2536 | - |
+ | 1.9203 | 7850 | 0.2385 | - |
+ | 1.9325 | 7900 | 0.2695 | - |
+ | 1.9447 | 7950 | 0.2512 | - |
+ | 1.9569 | 8000 | 0.2216 | - |
+ | 1.9692 | 8050 | 0.2291 | - |
+ | 1.9814 | 8100 | 0.2443 | - |
+ | 1.9936 | 8150 | 0.2579 | - |
+ | 2.0 | 8176 | - | 0.5 |
+ | 2.0059 | 8200 | 0.2605 | - |
+ | 2.0181 | 8250 | 0.2528 | - |
+ | 2.0303 | 8300 | 0.2361 | - |
+ | 2.0426 | 8350 | 0.2891 | - |
+ | 2.0548 | 8400 | 0.2692 | - |
+ | 2.0670 | 8450 | 0.25 | - |
+ | 2.0793 | 8500 | 0.2362 | - |
+ | 2.0915 | 8550 | 0.2833 | - |
+ | 2.1037 | 8600 | 0.2698 | - |
+ | 2.1159 | 8650 | 0.2195 | - |
+ | 2.1282 | 8700 | 0.2621 | - |
+ | 2.1404 | 8750 | 0.2564 | - |
+ | 2.1526 | 8800 | 0.2657 | - |
+ | 2.1649 | 8850 | 0.2629 | - |
+ | 2.1771 | 8900 | 0.2503 | - |
+ | 2.1893 | 8950 | 0.2583 | - |
+ | 2.2016 | 9000 | 0.2694 | - |
+ | 2.2138 | 9050 | 0.2824 | - |
+ | 2.2260 | 9100 | 0.2675 | - |
+ | 2.2383 | 9150 | 0.2699 | - |
+ | 2.2505 | 9200 | 0.2515 | - |
+ | 2.2627 | 9250 | 0.2511 | - |
+ | 2.2750 | 9300 | 0.2518 | - |
+ | 2.2872 | 9350 | 0.2555 | - |
+ | 2.2994 | 9400 | 0.2512 | - |
+ | 2.3116 | 9450 | 0.2374 | - |
+ | 2.3239 | 9500 | 0.2546 | - |
+ | 2.3361 | 9550 | 0.2846 | - |
+ | 2.3483 | 9600 | 0.2617 | - |
+ | 2.3606 | 9650 | 0.2474 | - |
+ | 2.3728 | 9700 | 0.2454 | - |
+ | 2.3850 | 9750 | 0.2265 | - |
+ | 2.3973 | 9800 | 0.2272 | - |
+ | 2.4095 | 9850 | 0.2442 | - |
+ | 2.4217 | 9900 | 0.236 | - |
+ | 2.4340 | 9950 | 0.2382 | - |
+ | 2.4462 | 10000 | 0.2645 | - |
+ | 2.4584 | 10050 | 0.2707 | - |
+ | 2.4706 | 10100 | 0.2573 | - |
+ | 2.4829 | 10150 | 0.2435 | - |
+ | 2.4951 | 10200 | 0.2705 | - |
+ | 2.5073 | 10250 | 0.2808 | - |
+ | 2.5196 | 10300 | 0.2581 | - |
+ | 2.5318 | 10350 | 0.2544 | - |
+ | 2.5440 | 10400 | 0.2333 | - |
+ | 2.5563 | 10450 | 0.2544 | - |
+ | 2.5685 | 10500 | 0.2497 | - |
+ | 2.5807 | 10550 | 0.2575 | - |
+ | 2.5930 | 10600 | 0.2382 | - |
+ | 2.6052 | 10650 | 0.2451 | - |
+ | 2.6174 | 10700 | 0.2702 | - |
+ | 2.6296 | 10750 | 0.2569 | - |
+ | 2.6419 | 10800 | 0.249 | - |
+ | 2.6541 | 10850 | 0.2366 | - |
+ | 2.6663 | 10900 | 0.2278 | - |
+ | 2.6786 | 10950 | 0.2568 | - |
+ | 2.6908 | 11000 | 0.2721 | - |
+ | 2.7030 | 11050 | 0.2593 | - |
+ | 2.7153 | 11100 | 0.2439 | - |
+ | 2.7275 | 11150 | 0.2543 | - |
+ | 2.7397 | 11200 | 0.2478 | - |
+ | 2.7520 | 11250 | 0.2325 | - |
+ | 2.7642 | 11300 | 0.2538 | - |
+ | 2.7764 | 11350 | 0.2968 | - |
+ | 2.7886 | 11400 | 0.2505 | - |
+ | 2.8009 | 11450 | 0.2377 | - |
+ | 2.8131 | 11500 | 0.2547 | - |
+ | 2.8253 | 11550 | 0.2529 | - |
+ | 2.8376 | 11600 | 0.2502 | - |
+ | 2.8498 | 11650 | 0.2293 | - |
+ | 2.8620 | 11700 | 0.2676 | - |
+ | 2.8743 | 11750 | 0.2371 | - |
+ | 2.8865 | 11800 | 0.2495 | - |
+ | 2.8987 | 11850 | 0.2937 | - |
+ | 2.9110 | 11900 | 0.2355 | - |
+ | 2.9232 | 11950 | 0.2482 | - |
+ | 2.9354 | 12000 | 0.2336 | - |
+ | 2.9477 | 12050 | 0.2344 | - |
+ | 2.9599 | 12100 | 0.257 | - |
+ | 2.9721 | 12150 | 0.2557 | - |
+ | 2.9843 | 12200 | 0.2854 | - |
+ | 2.9966 | 12250 | 0.2455 | - |
+ | 3.0 | 12264 | - | 0.5 |
+ | 3.0088 | 12300 | 0.2323 | - |
+ | 3.0210 | 12350 | 0.2566 | - |
+ | 3.0333 | 12400 | 0.2319 | - |
+ | 3.0455 | 12450 | 0.2552 | - |
+ | 3.0577 | 12500 | 0.2796 | - |
+ | 3.0700 | 12550 | 0.2823 | - |
+ | 3.0822 | 12600 | 0.2303 | - |
+ | 3.0944 | 12650 | 0.2448 | - |
+ | 3.1067 | 12700 | 0.2502 | - |
+ | 3.1189 | 12750 | 0.2516 | - |
+ | 3.1311 | 12800 | 0.2537 | - |
+ | 3.1433 | 12850 | 0.251 | - |
+ | 3.1556 | 12900 | 0.2639 | - |
+ | 3.1678 | 12950 | 0.2321 | - |
+ | 3.1800 | 13000 | 0.282 | - |
+ | 3.1923 | 13050 | 0.2577 | - |
+ | 3.2045 | 13100 | 0.2448 | - |
+ | 3.2167 | 13150 | 0.2352 | - |
+ | 3.2290 | 13200 | 0.281 | - |
+ | 3.2412 | 13250 | 0.2337 | - |
+ | 3.2534 | 13300 | 0.268 | - |
+ | 3.2657 | 13350 | 0.261 | - |
+ | 3.2779 | 13400 | 0.2378 | - |
+ | 3.2901 | 13450 | 0.2588 | - |
+ | 3.3023 | 13500 | 0.266 | - |
+ | 3.3146 | 13550 | 0.2604 | - |
+ | 3.3268 | 13600 | 0.2202 | - |
+ | 3.3390 | 13650 | 0.2217 | - |
+ | 3.3513 | 13700 | 0.2464 | - |
+ | 3.3635 | 13750 | 0.2684 | - |
+ | 3.3757 | 13800 | 0.2279 | - |
+ | 3.3880 | 13850 | 0.2379 | - |
+ | 3.4002 | 13900 | 0.2741 | - |
+ | 3.4124 | 13950 | 0.2713 | - |
+ | 3.4247 | 14000 | 0.2581 | - |
+ | 3.4369 | 14050 | 0.2638 | - |
+ | 3.4491 | 14100 | 0.2125 | - |
+ | 3.4614 | 14150 | 0.2348 | - |
+ | 3.4736 | 14200 | 0.2253 | - |
+ | 3.4858 | 14250 | 0.2627 | - |
+ | 3.4980 | 14300 | 0.2463 | - |
+ | 3.5103 | 14350 | 0.2533 | - |
+ | 3.5225 | 14400 | 0.2422 | - |
+ | 3.5347 | 14450 | 0.2296 | - |
+ | 3.5470 | 14500 | 0.2532 | - |
+ | 3.5592 | 14550 | 0.2733 | - |
+ | 3.5714 | 14600 | 0.2258 | - |
+ | 3.5837 | 14650 | 0.2253 | - |
+ | 3.5959 | 14700 | 0.2388 | - |
+ | 3.6081 | 14750 | 0.2217 | - |
+ | 3.6204 | 14800 | 0.3033 | - |
+ | 3.6326 | 14850 | 0.2349 | - |
+ | 3.6448 | 14900 | 0.2596 | - |
+ | 3.6570 | 14950 | 0.2415 | - |
+ | 3.6693 | 15000 | 0.2494 | - |
+ | 3.6815 | 15050 | 0.2826 | - |
+ | 3.6937 | 15100 | 0.2633 | - |
+ | 3.7060 | 15150 | 0.2636 | - |
+ | 3.7182 | 15200 | 0.2351 | - |
+ | 3.7304 | 15250 | 0.264 | - |
+ | 3.7427 | 15300 | 0.2652 | - |
+ | 3.7549 | 15350 | 0.2724 | - |
+ | 3.7671 | 15400 | 0.2731 | - |
+ | 3.7794 | 15450 | 0.2825 | - |
+ | 3.7916 | 15500 | 0.2611 | - |
+ | 3.8038 | 15550 | 0.2574 | - |
+ | 3.8160 | 15600 | 0.261 | - |
+ | 3.8283 | 15650 | 0.219 | - |
+ | 3.8405 | 15700 | 0.2323 | - |
+ | 3.8527 | 15750 | 0.2442 | - |
+ | 3.8650 | 15800 | 0.2509 | - |
+ | 3.8772 | 15850 | 0.26 | - |
+ | 3.8894 | 15900 | 0.2475 | - |
+ | 3.9017 | 15950 | 0.2452 | - |
+ | 3.9139 | 16000 | 0.2598 | - |
+ | 3.9261 | 16050 | 0.2377 | - |
+ | 3.9384 | 16100 | 0.2445 | - |
+ | 3.9506 | 16150 | 0.2451 | - |
+ | 3.9628 | 16200 | 0.2714 | - |
+ | 3.9750 | 16250 | 0.2755 | - |
+ | 3.9873 | 16300 | 0.2579 | - |
+ | 3.9995 | 16350 | 0.2338 | - |
+ | 4.0 | 16352 | - | 0.5 |
+
+ * The bold row denotes the saved checkpoint.
+
+ ### Framework Versions
+ - Python: 3.10.12
+ - SetFit: 1.0.3
+ - Sentence Transformers: 2.7.0
+ - Transformers: 4.40.0
+ - PyTorch: 2.2.1+cu121
+ - Datasets: 2.19.0
+ - Tokenizers: 0.19.1
+
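+ To approximate this environment, the listed versions can be pinned at install time (a sketch; PyTorch with CUDA 12.1 is usually installed separately from the matching PyTorch index):
+
+ ```bash
+ pip install "setfit==1.0.3" "sentence-transformers==2.7.0" "transformers==4.40.0" "datasets==2.19.0" "tokenizers==0.19.1"
+ ```
+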
+ ## Citation
+
+ ### BibTeX
+ ```bibtex
+ @article{https://doi.org/10.48550/arxiv.2209.11055,
+   doi = {10.48550/ARXIV.2209.11055},
+   url = {https://arxiv.org/abs/2209.11055},
+   author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
+   keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
+   title = {Efficient Few-Shot Learning Without Prompts},
+   publisher = {arXiv},
+   year = {2022},
+   copyright = {Creative Commons Attribution 4.0 International}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,27 @@
+ {
+   "_name_or_path": "checkpoints/step_4088",
+   "architectures": [
+     "RobertaModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 1024,
+   "initializer_range": 0.02,
+   "intermediate_size": 4096,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "roberta",
+   "num_attention_heads": 16,
+   "num_hidden_layers": 24,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.40.0",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 50265
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "__version__": {
+     "sentence_transformers": "2.7.0",
+     "transformers": "4.40.0",
+     "pytorch": "2.2.1+cu121"
+   },
+   "prompts": {},
+   "default_prompt_name": null
+ }
config_setfit.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "labels": [
+     "false",
+     "true"
+   ],
+   "normalize_embeddings": false
+ }
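These are the string labels the classification head maps to, so inference returns them directly. A quick check (sketch):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("richie-ghost/setfit-FacebookAI-roberta-large-phatic")
print(model.labels)                      # ['false', 'true']
print(model.predict(["See you soon!"]))  # e.g. ['true']
```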
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c9cb25fa2b231b9bf4a76bd9bee3890518ae2f90410abcd16c7536c151cbc5f4
+ size 1421483904
model_head.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ba5201b33d7b0ed4ee07dec9dc5ff4d3b06399c8bc693612567205d36efb252c
+ size 9071
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
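modules.json wires the Sentence Transformer body as a two-module pipeline: the RoBERTa Transformer followed by the mean-pooling module configured in 1_Pooling/config.json. An equivalent pipeline can be assembled by hand; a sketch built from the base model rather than this repository:

```python
from sentence_transformers import SentenceTransformer, models

# Transformer body (max 512 tokens) followed by mean pooling over token embeddings.
word_embedding = models.Transformer("FacebookAI/roberta-large", max_seq_length=512)
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="mean")
body = SentenceTransformer(modules=[word_embedding, pooling])

print(body.encode(["Take it easy!"]).shape)   # (1, 1024)
```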
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 512,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "cls_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,64 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "50264": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "errors": "replace",
+   "mask_token": "<mask>",
+   "max_length": 512,
+   "model_max_length": 512,
+   "pad_to_multiple_of": null,
+   "pad_token": "<pad>",
+   "pad_token_type_id": 0,
+   "padding_side": "right",
+   "sep_token": "</s>",
+   "stride": 0,
+   "tokenizer_class": "RobertaTokenizer",
+   "trim_offsets": true,
+   "truncation_side": "right",
+   "truncation_strategy": "longest_first",
+   "unk_token": "<unk>"
+ }
vocab.json ADDED
The diff for this file is too large to render. See raw diff