Muennighoff committed on
Commit 2f6b75f
1 Parent(s): bd44305

Add MTEB metadata

Files changed (1)
  1. README.md +2204 -1
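
For context, scores like the ones added below are typically produced by running the model through the `mteb` evaluation harness and then converting its JSON output into the Hugging Face `model-index` YAML. A minimal sketch, assuming the `mteb` and `sentence-transformers` packages are installed; the task name and output folder are illustrative, not taken from this commit:

```python
# Minimal sketch: evaluate all-mpnet-base-v2 on one MTEB task.
# Assumes `pip install mteb sentence-transformers`; task/output names are illustrative.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# Pick any MTEB task, e.g. the Banking77 classification task reported below.
evaluation = MTEB(tasks=["Banking77Classification"])

# Writes one JSON file per task with the metrics (accuracy, f1, ...) that
# end up in the model-index section of the README.
evaluation.run(model, output_folder="results/all-mpnet-base-v2")
```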
README.md CHANGED
@@ -4,6 +4,7 @@ tags:
4
  - sentence-transformers
5
  - feature-extraction
6
  - sentence-similarity
7
+ - mteb
8
  language: en
9
  license: apache-2.0
10
  datasets:
@@ -28,7 +28,2209 @@ datasets:
29
  - embedding-data/SPECTER
30
  - embedding-data/PAQ_pairs
31
  - embedding-data/WikiAnswers
-
32
+ model-index:
33
+ - name: all-mpnet-base-v2
34
+ results:
35
+ - task:
36
+ type: Classification
37
+ dataset:
38
+ type: mteb/amazon_counterfactual
39
+ name: MTEB AmazonCounterfactualClassification (en)
40
+ config: en
41
+ split: test
42
+ metrics:
43
+ - type: accuracy
44
+ value: 65.26865671641791
45
+ - type: ap
46
+ value: 28.47453420428918
47
+ - type: f1
48
+ value: 59.3470101009448
49
+ - task:
50
+ type: Classification
51
+ dataset:
52
+ type: mteb/amazon_polarity
53
+ name: MTEB AmazonPolarityClassification
54
+ config: default
55
+ split: test
56
+ metrics:
57
+ - type: accuracy
58
+ value: 67.13145
59
+ - type: ap
60
+ value: 61.842060778903786
61
+ - type: f1
62
+ value: 66.79987305640383
63
+ - task:
64
+ type: Classification
65
+ dataset:
66
+ type: mteb/amazon_reviews_multi
67
+ name: MTEB AmazonReviewsClassification (en)
68
+ config: en
69
+ split: test
70
+ metrics:
71
+ - type: accuracy
72
+ value: 31.920000000000005
73
+ - type: f1
74
+ value: 31.2465193896153
75
+ - task:
76
+ type: Retrieval
77
+ dataset:
78
+ type: arguana
79
+ name: MTEB ArguAna
80
+ config: default
81
+ split: test
82
+ metrics:
83
+ - type: map_at_1
84
+ value: 23.186
85
+ - type: map_at_10
86
+ value: 37.692
87
+ - type: map_at_100
88
+ value: 38.986
89
+ - type: map_at_1000
90
+ value: 38.991
91
+ - type: map_at_3
92
+ value: 32.622
93
+ - type: map_at_5
94
+ value: 35.004999999999995
95
+ - type: ndcg_at_1
96
+ value: 23.186
97
+ - type: ndcg_at_10
98
+ value: 46.521
99
+ - type: ndcg_at_100
100
+ value: 51.954
101
+ - type: ndcg_at_1000
102
+ value: 52.087
103
+ - type: ndcg_at_3
104
+ value: 35.849
105
+ - type: ndcg_at_5
106
+ value: 40.12
107
+ - type: precision_at_1
108
+ value: 23.186
109
+ - type: precision_at_10
110
+ value: 7.510999999999999
111
+ - type: precision_at_100
112
+ value: 0.9860000000000001
113
+ - type: precision_at_1000
114
+ value: 0.1
115
+ - type: precision_at_3
116
+ value: 15.078
117
+ - type: precision_at_5
118
+ value: 11.110000000000001
119
+ - type: recall_at_1
120
+ value: 23.186
121
+ - type: recall_at_10
122
+ value: 75.107
123
+ - type: recall_at_100
124
+ value: 98.649
125
+ - type: recall_at_1000
126
+ value: 99.644
127
+ - type: recall_at_3
128
+ value: 45.235
129
+ - type: recall_at_5
130
+ value: 55.547999999999995
131
+ - task:
132
+ type: Clustering
133
+ dataset:
134
+ type: mteb/arxiv-clustering-p2p
135
+ name: MTEB ArxivClusteringP2P
136
+ config: default
137
+ split: test
138
+ metrics:
139
+ - type: v_measure
140
+ value: 48.37886340922374
141
+ - task:
142
+ type: Clustering
143
+ dataset:
144
+ type: mteb/arxiv-clustering-s2s
145
+ name: MTEB ArxivClusteringS2S
146
+ config: default
147
+ split: test
148
+ metrics:
149
+ - type: v_measure
150
+ value: 39.72488615315985
151
+ - task:
152
+ type: Reranking
153
+ dataset:
154
+ type: mteb/askubuntudupquestions-reranking
155
+ name: MTEB AskUbuntuDupQuestions
156
+ config: default
157
+ split: test
158
+ metrics:
159
+ - type: map
160
+ value: 65.85199009344481
161
+ - type: mrr
162
+ value: 78.47700391329201
163
+ - task:
164
+ type: STS
165
+ dataset:
166
+ type: mteb/biosses-sts
167
+ name: MTEB BIOSSES
168
+ config: default
169
+ split: test
170
+ metrics:
171
+ - type: cos_sim_pearson
172
+ value: 84.47737119217858
173
+ - type: cos_sim_spearman
174
+ value: 80.43195317854409
175
+ - type: euclidean_pearson
176
+ value: 82.20496332547978
177
+ - type: euclidean_spearman
178
+ value: 80.43195317854409
179
+ - type: manhattan_pearson
180
+ value: 81.4836610720397
181
+ - type: manhattan_spearman
182
+ value: 79.65904400101908
183
+ - task:
184
+ type: Classification
185
+ dataset:
186
+ type: mteb/banking77
187
+ name: MTEB Banking77Classification
188
+ config: default
189
+ split: test
190
+ metrics:
191
+ - type: accuracy
192
+ value: 81.8603896103896
193
+ - type: f1
194
+ value: 81.28027245637479
195
+ - task:
196
+ type: Clustering
197
+ dataset:
198
+ type: mteb/biorxiv-clustering-p2p
199
+ name: MTEB BiorxivClusteringP2P
200
+ config: default
201
+ split: test
202
+ metrics:
203
+ - type: v_measure
204
+ value: 39.616605133625185
205
+ - task:
206
+ type: Clustering
207
+ dataset:
208
+ type: mteb/biorxiv-clustering-s2s
209
+ name: MTEB BiorxivClusteringS2S
210
+ config: default
211
+ split: test
212
+ metrics:
213
+ - type: v_measure
214
+ value: 35.02442407186902
215
+ - task:
216
+ type: Retrieval
217
+ dataset:
218
+ type: BeIR/cqadupstack
219
+ name: MTEB CQADupstackAndroidRetrieval
220
+ config: default
221
+ split: test
222
+ metrics:
223
+ - type: map_at_1
224
+ value: 36.036
225
+ - type: map_at_10
226
+ value: 49.302
227
+ - type: map_at_100
228
+ value: 50.956
229
+ - type: map_at_1000
230
+ value: 51.080000000000005
231
+ - type: map_at_3
232
+ value: 45.237
233
+ - type: map_at_5
234
+ value: 47.353
235
+ - type: ndcg_at_1
236
+ value: 45.207
237
+ - type: ndcg_at_10
238
+ value: 56.485
239
+ - type: ndcg_at_100
240
+ value: 61.413
241
+ - type: ndcg_at_1000
242
+ value: 62.870000000000005
243
+ - type: ndcg_at_3
244
+ value: 51.346000000000004
245
+ - type: ndcg_at_5
246
+ value: 53.486
247
+ - type: precision_at_1
248
+ value: 45.207
249
+ - type: precision_at_10
250
+ value: 11.144
251
+ - type: precision_at_100
252
+ value: 1.735
253
+ - type: precision_at_1000
254
+ value: 0.22100000000000003
255
+ - type: precision_at_3
256
+ value: 24.94
257
+ - type: precision_at_5
258
+ value: 17.997
259
+ - type: recall_at_1
260
+ value: 36.036
261
+ - type: recall_at_10
262
+ value: 69.191
263
+ - type: recall_at_100
264
+ value: 89.423
265
+ - type: recall_at_1000
266
+ value: 98.425
267
+ - type: recall_at_3
268
+ value: 53.849999999999994
269
+ - type: recall_at_5
270
+ value: 60.107
271
+ - task:
272
+ type: Retrieval
273
+ dataset:
274
+ type: BeIR/cqadupstack
275
+ name: MTEB CQADupstackEnglishRetrieval
276
+ config: default
277
+ split: test
278
+ metrics:
279
+ - type: map_at_1
280
+ value: 32.92
281
+ - type: map_at_10
282
+ value: 45.739999999999995
283
+ - type: map_at_100
284
+ value: 47.309
285
+ - type: map_at_1000
286
+ value: 47.443000000000005
287
+ - type: map_at_3
288
+ value: 42.154
289
+ - type: map_at_5
290
+ value: 44.207
291
+ - type: ndcg_at_1
292
+ value: 42.229
293
+ - type: ndcg_at_10
294
+ value: 52.288999999999994
295
+ - type: ndcg_at_100
296
+ value: 57.04900000000001
297
+ - type: ndcg_at_1000
298
+ value: 58.788
299
+ - type: ndcg_at_3
300
+ value: 47.531
301
+ - type: ndcg_at_5
302
+ value: 49.861
303
+ - type: precision_at_1
304
+ value: 42.229
305
+ - type: precision_at_10
306
+ value: 10.299
307
+ - type: precision_at_100
308
+ value: 1.68
309
+ - type: precision_at_1000
310
+ value: 0.213
311
+ - type: precision_at_3
312
+ value: 23.673
313
+ - type: precision_at_5
314
+ value: 17.006
315
+ - type: recall_at_1
316
+ value: 32.92
317
+ - type: recall_at_10
318
+ value: 63.865
319
+ - type: recall_at_100
320
+ value: 84.06700000000001
321
+ - type: recall_at_1000
322
+ value: 94.536
323
+ - type: recall_at_3
324
+ value: 49.643
325
+ - type: recall_at_5
326
+ value: 56.119
327
+ - task:
328
+ type: Retrieval
329
+ dataset:
330
+ type: BeIR/cqadupstack
331
+ name: MTEB CQADupstackGamingRetrieval
332
+ config: default
333
+ split: test
334
+ metrics:
335
+ - type: map_at_1
336
+ value: 40.695
337
+ - type: map_at_10
338
+ value: 53.787
339
+ - type: map_at_100
340
+ value: 54.778000000000006
341
+ - type: map_at_1000
342
+ value: 54.827000000000005
343
+ - type: map_at_3
344
+ value: 50.151999999999994
345
+ - type: map_at_5
346
+ value: 52.207
347
+ - type: ndcg_at_1
348
+ value: 46.52
349
+ - type: ndcg_at_10
350
+ value: 60.026
351
+ - type: ndcg_at_100
352
+ value: 63.81099999999999
353
+ - type: ndcg_at_1000
354
+ value: 64.741
355
+ - type: ndcg_at_3
356
+ value: 53.83
357
+ - type: ndcg_at_5
358
+ value: 56.928999999999995
359
+ - type: precision_at_1
360
+ value: 46.52
361
+ - type: precision_at_10
362
+ value: 9.754999999999999
363
+ - type: precision_at_100
364
+ value: 1.2670000000000001
365
+ - type: precision_at_1000
366
+ value: 0.13799999999999998
367
+ - type: precision_at_3
368
+ value: 24.096
369
+ - type: precision_at_5
370
+ value: 16.689999999999998
371
+ - type: recall_at_1
372
+ value: 40.695
373
+ - type: recall_at_10
374
+ value: 75.181
375
+ - type: recall_at_100
376
+ value: 91.479
377
+ - type: recall_at_1000
378
+ value: 98.06899999999999
379
+ - type: recall_at_3
380
+ value: 58.707
381
+ - type: recall_at_5
382
+ value: 66.295
383
+ - task:
384
+ type: Retrieval
385
+ dataset:
386
+ type: BeIR/cqadupstack
387
+ name: MTEB CQADupstackGisRetrieval
388
+ config: default
389
+ split: test
390
+ metrics:
391
+ - type: map_at_1
392
+ value: 29.024
393
+ - type: map_at_10
394
+ value: 38.438
395
+ - type: map_at_100
396
+ value: 39.576
397
+ - type: map_at_1000
398
+ value: 39.645
399
+ - type: map_at_3
400
+ value: 34.827999999999996
401
+ - type: map_at_5
402
+ value: 36.947
403
+ - type: ndcg_at_1
404
+ value: 31.299
405
+ - type: ndcg_at_10
406
+ value: 44.268
407
+ - type: ndcg_at_100
408
+ value: 49.507
409
+ - type: ndcg_at_1000
410
+ value: 51.205999999999996
411
+ - type: ndcg_at_3
412
+ value: 37.248999999999995
413
+ - type: ndcg_at_5
414
+ value: 40.861999999999995
415
+ - type: precision_at_1
416
+ value: 31.299
417
+ - type: precision_at_10
418
+ value: 6.949
419
+ - type: precision_at_100
420
+ value: 1.012
421
+ - type: precision_at_1000
422
+ value: 0.11900000000000001
423
+ - type: precision_at_3
424
+ value: 15.518
425
+ - type: precision_at_5
426
+ value: 11.366999999999999
427
+ - type: recall_at_1
428
+ value: 29.024
429
+ - type: recall_at_10
430
+ value: 60.404
431
+ - type: recall_at_100
432
+ value: 83.729
433
+ - type: recall_at_1000
434
+ value: 96.439
435
+ - type: recall_at_3
436
+ value: 41.65
437
+ - type: recall_at_5
438
+ value: 50.263999999999996
439
+ - task:
440
+ type: Retrieval
441
+ dataset:
442
+ type: BeIR/cqadupstack
443
+ name: MTEB CQADupstackMathematicaRetrieval
444
+ config: default
445
+ split: test
446
+ metrics:
447
+ - type: map_at_1
448
+ value: 17.774
449
+ - type: map_at_10
450
+ value: 28.099
451
+ - type: map_at_100
452
+ value: 29.603
453
+ - type: map_at_1000
454
+ value: 29.709999999999997
455
+ - type: map_at_3
456
+ value: 25.036
457
+ - type: map_at_5
458
+ value: 26.657999999999998
459
+ - type: ndcg_at_1
460
+ value: 22.139
461
+ - type: ndcg_at_10
462
+ value: 34.205999999999996
463
+ - type: ndcg_at_100
464
+ value: 40.844
465
+ - type: ndcg_at_1000
466
+ value: 43.144
467
+ - type: ndcg_at_3
468
+ value: 28.732999999999997
469
+ - type: ndcg_at_5
470
+ value: 31.252000000000002
471
+ - type: precision_at_1
472
+ value: 22.139
473
+ - type: precision_at_10
474
+ value: 6.567
475
+ - type: precision_at_100
476
+ value: 1.147
477
+ - type: precision_at_1000
478
+ value: 0.146
479
+ - type: precision_at_3
480
+ value: 14.386
481
+ - type: precision_at_5
482
+ value: 10.423
483
+ - type: recall_at_1
484
+ value: 17.774
485
+ - type: recall_at_10
486
+ value: 48.32
487
+ - type: recall_at_100
488
+ value: 76.373
489
+ - type: recall_at_1000
490
+ value: 92.559
491
+ - type: recall_at_3
492
+ value: 33.478
493
+ - type: recall_at_5
494
+ value: 39.872
495
+ - task:
496
+ type: Retrieval
497
+ dataset:
498
+ type: BeIR/cqadupstack
499
+ name: MTEB CQADupstackPhysicsRetrieval
500
+ config: default
501
+ split: test
502
+ metrics:
503
+ - type: map_at_1
504
+ value: 31.885
505
+ - type: map_at_10
506
+ value: 44.289
507
+ - type: map_at_100
508
+ value: 45.757999999999996
509
+ - type: map_at_1000
510
+ value: 45.86
511
+ - type: map_at_3
512
+ value: 40.459
513
+ - type: map_at_5
514
+ value: 42.662
515
+ - type: ndcg_at_1
516
+ value: 39.75
517
+ - type: ndcg_at_10
518
+ value: 50.975
519
+ - type: ndcg_at_100
520
+ value: 56.528999999999996
521
+ - type: ndcg_at_1000
522
+ value: 58.06099999999999
523
+ - type: ndcg_at_3
524
+ value: 45.327
525
+ - type: ndcg_at_5
526
+ value: 48.041
527
+ - type: precision_at_1
528
+ value: 39.75
529
+ - type: precision_at_10
530
+ value: 9.557
531
+ - type: precision_at_100
532
+ value: 1.469
533
+ - type: precision_at_1000
534
+ value: 0.17700000000000002
535
+ - type: precision_at_3
536
+ value: 22.073
537
+ - type: precision_at_5
538
+ value: 15.765
539
+ - type: recall_at_1
540
+ value: 31.885
541
+ - type: recall_at_10
542
+ value: 64.649
543
+ - type: recall_at_100
544
+ value: 87.702
545
+ - type: recall_at_1000
546
+ value: 97.327
547
+ - type: recall_at_3
548
+ value: 48.61
549
+ - type: recall_at_5
550
+ value: 55.882
551
+ - task:
552
+ type: Retrieval
553
+ dataset:
554
+ type: BeIR/cqadupstack
555
+ name: MTEB CQADupstackProgrammersRetrieval
556
+ config: default
557
+ split: test
558
+ metrics:
559
+ - type: map_at_1
560
+ value: 26.454
561
+ - type: map_at_10
562
+ value: 37.756
563
+ - type: map_at_100
564
+ value: 39.225
565
+ - type: map_at_1000
566
+ value: 39.332
567
+ - type: map_at_3
568
+ value: 34.115
569
+ - type: map_at_5
570
+ value: 35.942
571
+ - type: ndcg_at_1
572
+ value: 32.42
573
+ - type: ndcg_at_10
574
+ value: 44.165
575
+ - type: ndcg_at_100
576
+ value: 50.202000000000005
577
+ - type: ndcg_at_1000
578
+ value: 52.188
579
+ - type: ndcg_at_3
580
+ value: 38.381
581
+ - type: ndcg_at_5
582
+ value: 40.849000000000004
583
+ - type: precision_at_1
584
+ value: 32.42
585
+ - type: precision_at_10
586
+ value: 8.482000000000001
587
+ - type: precision_at_100
588
+ value: 1.332
589
+ - type: precision_at_1000
590
+ value: 0.169
591
+ - type: precision_at_3
592
+ value: 18.683
593
+ - type: precision_at_5
594
+ value: 13.539000000000001
595
+ - type: recall_at_1
596
+ value: 26.454
597
+ - type: recall_at_10
598
+ value: 57.937000000000005
599
+ - type: recall_at_100
600
+ value: 83.76
601
+ - type: recall_at_1000
602
+ value: 96.82600000000001
603
+ - type: recall_at_3
604
+ value: 41.842
605
+ - type: recall_at_5
606
+ value: 48.285
607
+ - task:
608
+ type: Retrieval
609
+ dataset:
610
+ type: BeIR/cqadupstack
611
+ name: MTEB CQADupstackRetrieval
612
+ config: default
613
+ split: test
614
+ metrics:
615
+ - type: map_at_1
616
+ value: 27.743666666666666
617
+ - type: map_at_10
618
+ value: 38.75416666666667
619
+ - type: map_at_100
620
+ value: 40.133250000000004
621
+ - type: map_at_1000
622
+ value: 40.24616666666667
623
+ - type: map_at_3
624
+ value: 35.267250000000004
625
+ - type: map_at_5
626
+ value: 37.132749999999994
627
+ - type: ndcg_at_1
628
+ value: 33.14358333333333
629
+ - type: ndcg_at_10
630
+ value: 44.95916666666667
631
+ - type: ndcg_at_100
632
+ value: 50.46375
633
+ - type: ndcg_at_1000
634
+ value: 52.35508333333334
635
+ - type: ndcg_at_3
636
+ value: 39.17883333333334
637
+ - type: ndcg_at_5
638
+ value: 41.79724999999999
639
+ - type: precision_at_1
640
+ value: 33.14358333333333
641
+ - type: precision_at_10
642
+ value: 8.201083333333333
643
+ - type: precision_at_100
644
+ value: 1.3085
645
+ - type: precision_at_1000
646
+ value: 0.1665833333333333
647
+ - type: precision_at_3
648
+ value: 18.405583333333333
649
+ - type: precision_at_5
650
+ value: 13.233166666666666
651
+ - type: recall_at_1
652
+ value: 27.743666666666666
653
+ - type: recall_at_10
654
+ value: 58.91866666666667
655
+ - type: recall_at_100
656
+ value: 82.76216666666666
657
+ - type: recall_at_1000
658
+ value: 95.56883333333333
659
+ - type: recall_at_3
660
+ value: 42.86925
661
+ - type: recall_at_5
662
+ value: 49.553333333333335
663
+ - task:
664
+ type: Retrieval
665
+ dataset:
666
+ type: BeIR/cqadupstack
667
+ name: MTEB CQADupstackStatsRetrieval
668
+ config: default
669
+ split: test
670
+ metrics:
671
+ - type: map_at_1
672
+ value: 25.244
673
+ - type: map_at_10
674
+ value: 33.464
675
+ - type: map_at_100
676
+ value: 34.633
677
+ - type: map_at_1000
678
+ value: 34.721999999999994
679
+ - type: map_at_3
680
+ value: 30.784
681
+ - type: map_at_5
682
+ value: 32.183
683
+ - type: ndcg_at_1
684
+ value: 28.681
685
+ - type: ndcg_at_10
686
+ value: 38.149
687
+ - type: ndcg_at_100
688
+ value: 43.856
689
+ - type: ndcg_at_1000
690
+ value: 46.026
691
+ - type: ndcg_at_3
692
+ value: 33.318
693
+ - type: ndcg_at_5
694
+ value: 35.454
695
+ - type: precision_at_1
696
+ value: 28.681
697
+ - type: precision_at_10
698
+ value: 6.304
699
+ - type: precision_at_100
700
+ value: 0.992
701
+ - type: precision_at_1000
702
+ value: 0.125
703
+ - type: precision_at_3
704
+ value: 14.673
705
+ - type: precision_at_5
706
+ value: 10.245
707
+ - type: recall_at_1
708
+ value: 25.244
709
+ - type: recall_at_10
710
+ value: 49.711
711
+ - type: recall_at_100
712
+ value: 75.928
713
+ - type: recall_at_1000
714
+ value: 91.79899999999999
715
+ - type: recall_at_3
716
+ value: 36.325
717
+ - type: recall_at_5
718
+ value: 41.752
719
+ - task:
720
+ type: Retrieval
721
+ dataset:
722
+ type: BeIR/cqadupstack
723
+ name: MTEB CQADupstackTexRetrieval
724
+ config: default
725
+ split: test
726
+ metrics:
727
+ - type: map_at_1
728
+ value: 18.857
729
+ - type: map_at_10
730
+ value: 27.794
731
+ - type: map_at_100
732
+ value: 29.186
733
+ - type: map_at_1000
734
+ value: 29.323
735
+ - type: map_at_3
736
+ value: 24.779
737
+ - type: map_at_5
738
+ value: 26.459
739
+ - type: ndcg_at_1
740
+ value: 23.227999999999998
741
+ - type: ndcg_at_10
742
+ value: 33.353
743
+ - type: ndcg_at_100
744
+ value: 39.598
745
+ - type: ndcg_at_1000
746
+ value: 42.268
747
+ - type: ndcg_at_3
748
+ value: 28.054000000000002
749
+ - type: ndcg_at_5
750
+ value: 30.566
751
+ - type: precision_at_1
752
+ value: 23.227999999999998
753
+ - type: precision_at_10
754
+ value: 6.397
755
+ - type: precision_at_100
756
+ value: 1.129
757
+ - type: precision_at_1000
758
+ value: 0.155
759
+ - type: precision_at_3
760
+ value: 13.616
761
+ - type: precision_at_5
762
+ value: 10.116999999999999
763
+ - type: recall_at_1
764
+ value: 18.857
765
+ - type: recall_at_10
766
+ value: 45.797
767
+ - type: recall_at_100
768
+ value: 73.615
769
+ - type: recall_at_1000
770
+ value: 91.959
771
+ - type: recall_at_3
772
+ value: 31.129
773
+ - type: recall_at_5
774
+ value: 37.565
775
+ - task:
776
+ type: Retrieval
777
+ dataset:
778
+ type: BeIR/cqadupstack
779
+ name: MTEB CQADupstackUnixRetrieval
780
+ config: default
781
+ split: test
782
+ metrics:
783
+ - type: map_at_1
784
+ value: 27.486
785
+ - type: map_at_10
786
+ value: 39.164
787
+ - type: map_at_100
788
+ value: 40.543
789
+ - type: map_at_1000
790
+ value: 40.636
791
+ - type: map_at_3
792
+ value: 35.52
793
+ - type: map_at_5
794
+ value: 37.355
795
+ - type: ndcg_at_1
796
+ value: 32.275999999999996
797
+ - type: ndcg_at_10
798
+ value: 45.414
799
+ - type: ndcg_at_100
800
+ value: 51.254
801
+ - type: ndcg_at_1000
802
+ value: 53.044000000000004
803
+ - type: ndcg_at_3
804
+ value: 39.324999999999996
805
+ - type: ndcg_at_5
806
+ value: 41.835
807
+ - type: precision_at_1
808
+ value: 32.275999999999996
809
+ - type: precision_at_10
810
+ value: 8.144
811
+ - type: precision_at_100
812
+ value: 1.237
813
+ - type: precision_at_1000
814
+ value: 0.15
815
+ - type: precision_at_3
816
+ value: 18.501
817
+ - type: precision_at_5
818
+ value: 13.134
819
+ - type: recall_at_1
820
+ value: 27.486
821
+ - type: recall_at_10
822
+ value: 60.449
823
+ - type: recall_at_100
824
+ value: 85.176
825
+ - type: recall_at_1000
826
+ value: 97.087
827
+ - type: recall_at_3
828
+ value: 43.59
829
+ - type: recall_at_5
830
+ value: 50.08899999999999
831
+ - task:
832
+ type: Retrieval
833
+ dataset:
834
+ type: BeIR/cqadupstack
835
+ name: MTEB CQADupstackWebmastersRetrieval
836
+ config: default
837
+ split: test
838
+ metrics:
839
+ - type: map_at_1
840
+ value: 26.207
841
+ - type: map_at_10
842
+ value: 37.255
843
+ - type: map_at_100
844
+ value: 39.043
845
+ - type: map_at_1000
846
+ value: 39.273
847
+ - type: map_at_3
848
+ value: 33.487
849
+ - type: map_at_5
850
+ value: 35.441
851
+ - type: ndcg_at_1
852
+ value: 31.423000000000002
853
+ - type: ndcg_at_10
854
+ value: 44.235
855
+ - type: ndcg_at_100
856
+ value: 50.49
857
+ - type: ndcg_at_1000
858
+ value: 52.283
859
+ - type: ndcg_at_3
860
+ value: 37.602000000000004
861
+ - type: ndcg_at_5
862
+ value: 40.518
863
+ - type: precision_at_1
864
+ value: 31.423000000000002
865
+ - type: precision_at_10
866
+ value: 8.715
867
+ - type: precision_at_100
868
+ value: 1.7590000000000001
869
+ - type: precision_at_1000
870
+ value: 0.257
871
+ - type: precision_at_3
872
+ value: 17.523
873
+ - type: precision_at_5
874
+ value: 13.161999999999999
875
+ - type: recall_at_1
876
+ value: 26.207
877
+ - type: recall_at_10
878
+ value: 59.17099999999999
879
+ - type: recall_at_100
880
+ value: 86.166
881
+ - type: recall_at_1000
882
+ value: 96.54700000000001
883
+ - type: recall_at_3
884
+ value: 41.18
885
+ - type: recall_at_5
886
+ value: 48.083999999999996
887
+ - task:
888
+ type: Retrieval
889
+ dataset:
890
+ type: BeIR/cqadupstack
891
+ name: MTEB CQADupstackWordpressRetrieval
892
+ config: default
893
+ split: test
894
+ metrics:
895
+ - type: map_at_1
896
+ value: 20.342
897
+ - type: map_at_10
898
+ value: 29.962
899
+ - type: map_at_100
900
+ value: 30.989
901
+ - type: map_at_1000
902
+ value: 31.102999999999998
903
+ - type: map_at_3
904
+ value: 26.656000000000002
905
+ - type: map_at_5
906
+ value: 28.179
907
+ - type: ndcg_at_1
908
+ value: 22.551
909
+ - type: ndcg_at_10
910
+ value: 35.945
911
+ - type: ndcg_at_100
912
+ value: 41.012
913
+ - type: ndcg_at_1000
914
+ value: 43.641999999999996
915
+ - type: ndcg_at_3
916
+ value: 29.45
917
+ - type: ndcg_at_5
918
+ value: 31.913999999999998
919
+ - type: precision_at_1
920
+ value: 22.551
921
+ - type: precision_at_10
922
+ value: 6.1
923
+ - type: precision_at_100
924
+ value: 0.943
925
+ - type: precision_at_1000
926
+ value: 0.129
927
+ - type: precision_at_3
928
+ value: 13.184999999999999
929
+ - type: precision_at_5
930
+ value: 9.353
931
+ - type: recall_at_1
932
+ value: 20.342
933
+ - type: recall_at_10
934
+ value: 52.349000000000004
935
+ - type: recall_at_100
936
+ value: 75.728
937
+ - type: recall_at_1000
938
+ value: 95.253
939
+ - type: recall_at_3
940
+ value: 34.427
941
+ - type: recall_at_5
942
+ value: 40.326
943
+ - task:
944
+ type: Retrieval
945
+ dataset:
946
+ type: climate-fever
947
+ name: MTEB ClimateFEVER
948
+ config: default
949
+ split: test
950
+ metrics:
951
+ - type: map_at_1
952
+ value: 7.71
953
+ - type: map_at_10
954
+ value: 14.81
955
+ - type: map_at_100
956
+ value: 16.536
957
+ - type: map_at_1000
958
+ value: 16.744999999999997
959
+ - type: map_at_3
960
+ value: 12.109
961
+ - type: map_at_5
962
+ value: 13.613
963
+ - type: ndcg_at_1
964
+ value: 18.046
965
+ - type: ndcg_at_10
966
+ value: 21.971
967
+ - type: ndcg_at_100
968
+ value: 29.468
969
+ - type: ndcg_at_1000
970
+ value: 33.428999999999995
971
+ - type: ndcg_at_3
972
+ value: 17.227999999999998
973
+ - type: ndcg_at_5
974
+ value: 19.189999999999998
975
+ - type: precision_at_1
976
+ value: 18.046
977
+ - type: precision_at_10
978
+ value: 7.192
979
+ - type: precision_at_100
980
+ value: 1.51
981
+ - type: precision_at_1000
982
+ value: 0.22499999999999998
983
+ - type: precision_at_3
984
+ value: 13.312
985
+ - type: precision_at_5
986
+ value: 10.775
987
+ - type: recall_at_1
988
+ value: 7.71
989
+ - type: recall_at_10
990
+ value: 27.908
991
+ - type: recall_at_100
992
+ value: 54.452
993
+ - type: recall_at_1000
994
+ value: 76.764
995
+ - type: recall_at_3
996
+ value: 16.64
997
+ - type: recall_at_5
998
+ value: 21.631
999
+ - task:
1000
+ type: Retrieval
1001
+ dataset:
1002
+ type: dbpedia-entity
1003
+ name: MTEB DBPedia
1004
+ config: default
1005
+ split: test
1006
+ metrics:
1007
+ - type: map_at_1
1008
+ value: 6.8180000000000005
1009
+ - type: map_at_10
1010
+ value: 14.591000000000001
1011
+ - type: map_at_100
1012
+ value: 19.855999999999998
1013
+ - type: map_at_1000
1014
+ value: 21.178
1015
+ - type: map_at_3
1016
+ value: 10.345
1017
+ - type: map_at_5
1018
+ value: 12.367
1019
+ - type: ndcg_at_1
1020
+ value: 39.25
1021
+ - type: ndcg_at_10
1022
+ value: 32.088
1023
+ - type: ndcg_at_100
1024
+ value: 36.019
1025
+ - type: ndcg_at_1000
1026
+ value: 43.649
1027
+ - type: ndcg_at_3
1028
+ value: 35.132999999999996
1029
+ - type: ndcg_at_5
1030
+ value: 33.777
1031
+ - type: precision_at_1
1032
+ value: 49.5
1033
+ - type: precision_at_10
1034
+ value: 25.624999999999996
1035
+ - type: precision_at_100
1036
+ value: 8.043
1037
+ - type: precision_at_1000
1038
+ value: 1.7409999999999999
1039
+ - type: precision_at_3
1040
+ value: 38.417
1041
+ - type: precision_at_5
1042
+ value: 33.2
1043
+ - type: recall_at_1
1044
+ value: 6.8180000000000005
1045
+ - type: recall_at_10
1046
+ value: 20.399
1047
+ - type: recall_at_100
1048
+ value: 42.8
1049
+ - type: recall_at_1000
1050
+ value: 68.081
1051
+ - type: recall_at_3
1052
+ value: 11.928999999999998
1053
+ - type: recall_at_5
1054
+ value: 15.348999999999998
1055
+ - task:
1056
+ type: Classification
1057
+ dataset:
1058
+ type: mteb/emotion
1059
+ name: MTEB EmotionClassification
1060
+ config: default
1061
+ split: test
1062
+ metrics:
1063
+ - type: accuracy
1064
+ value: 39.725
1065
+ - type: f1
1066
+ value: 35.19385687310605
1067
+ - task:
1068
+ type: Retrieval
1069
+ dataset:
1070
+ type: fever
1071
+ name: MTEB FEVER
1072
+ config: default
1073
+ split: test
1074
+ metrics:
1075
+ - type: map_at_1
1076
+ value: 31.901000000000003
1077
+ - type: map_at_10
1078
+ value: 44.156
1079
+ - type: map_at_100
1080
+ value: 44.901
1081
+ - type: map_at_1000
1082
+ value: 44.939
1083
+ - type: map_at_3
1084
+ value: 41.008
1085
+ - type: map_at_5
1086
+ value: 42.969
1087
+ - type: ndcg_at_1
1088
+ value: 34.263
1089
+ - type: ndcg_at_10
1090
+ value: 50.863
1091
+ - type: ndcg_at_100
1092
+ value: 54.336
1093
+ - type: ndcg_at_1000
1094
+ value: 55.297
1095
+ - type: ndcg_at_3
1096
+ value: 44.644
1097
+ - type: ndcg_at_5
1098
+ value: 48.075
1099
+ - type: precision_at_1
1100
+ value: 34.263
1101
+ - type: precision_at_10
1102
+ value: 7.542999999999999
1103
+ - type: precision_at_100
1104
+ value: 0.9400000000000001
1105
+ - type: precision_at_1000
1106
+ value: 0.104
1107
+ - type: precision_at_3
1108
+ value: 18.912000000000003
1109
+ - type: precision_at_5
1110
+ value: 13.177
1111
+ - type: recall_at_1
1112
+ value: 31.901000000000003
1113
+ - type: recall_at_10
1114
+ value: 68.872
1115
+ - type: recall_at_100
1116
+ value: 84.468
1117
+ - type: recall_at_1000
1118
+ value: 91.694
1119
+ - type: recall_at_3
1120
+ value: 52.272
1121
+ - type: recall_at_5
1122
+ value: 60.504999999999995
1123
+ - task:
1124
+ type: Retrieval
1125
+ dataset:
1126
+ type: fiqa
1127
+ name: MTEB FiQA2018
1128
+ config: default
1129
+ split: test
1130
+ metrics:
1131
+ - type: map_at_1
1132
+ value: 24.4
1133
+ - type: map_at_10
1134
+ value: 41.117
1135
+ - type: map_at_100
1136
+ value: 43.167
1137
+ - type: map_at_1000
1138
+ value: 43.323
1139
+ - type: map_at_3
1140
+ value: 35.744
1141
+ - type: map_at_5
1142
+ value: 38.708
1143
+ - type: ndcg_at_1
1144
+ value: 49.074
1145
+ - type: ndcg_at_10
1146
+ value: 49.963
1147
+ - type: ndcg_at_100
1148
+ value: 56.564
1149
+ - type: ndcg_at_1000
1150
+ value: 58.931999999999995
1151
+ - type: ndcg_at_3
1152
+ value: 45.489000000000004
1153
+ - type: ndcg_at_5
1154
+ value: 47.133
1155
+ - type: precision_at_1
1156
+ value: 49.074
1157
+ - type: precision_at_10
1158
+ value: 13.889000000000001
1159
+ - type: precision_at_100
1160
+ value: 2.091
1161
+ - type: precision_at_1000
1162
+ value: 0.251
1163
+ - type: precision_at_3
1164
+ value: 30.658
1165
+ - type: precision_at_5
1166
+ value: 22.593
1167
+ - type: recall_at_1
1168
+ value: 24.4
1169
+ - type: recall_at_10
1170
+ value: 58.111999999999995
1171
+ - type: recall_at_100
1172
+ value: 81.96900000000001
1173
+ - type: recall_at_1000
1174
+ value: 96.187
1175
+ - type: recall_at_3
1176
+ value: 41.661
1177
+ - type: recall_at_5
1178
+ value: 49.24
1179
+ - task:
1180
+ type: Retrieval
1181
+ dataset:
1182
+ type: hotpotqa
1183
+ name: MTEB HotpotQA
1184
+ config: default
1185
+ split: test
1186
+ metrics:
1187
+ - type: map_at_1
1188
+ value: 22.262
1189
+ - type: map_at_10
1190
+ value: 31.266
1191
+ - type: map_at_100
1192
+ value: 32.202
1193
+ - type: map_at_1000
1194
+ value: 32.300000000000004
1195
+ - type: map_at_3
1196
+ value: 28.874
1197
+ - type: map_at_5
1198
+ value: 30.246000000000002
1199
+ - type: ndcg_at_1
1200
+ value: 44.524
1201
+ - type: ndcg_at_10
1202
+ value: 39.294000000000004
1203
+ - type: ndcg_at_100
1204
+ value: 43.296
1205
+ - type: ndcg_at_1000
1206
+ value: 45.561
1207
+ - type: ndcg_at_3
1208
+ value: 35.013
1209
+ - type: ndcg_at_5
1210
+ value: 37.177
1211
+ - type: precision_at_1
1212
+ value: 44.524
1213
+ - type: precision_at_10
1214
+ value: 8.52
1215
+ - type: precision_at_100
1216
+ value: 1.169
1217
+ - type: precision_at_1000
1218
+ value: 0.147
1219
+ - type: precision_at_3
1220
+ value: 22.003
1221
+ - type: precision_at_5
1222
+ value: 14.914
1223
+ - type: recall_at_1
1224
+ value: 22.262
1225
+ - type: recall_at_10
1226
+ value: 42.6
1227
+ - type: recall_at_100
1228
+ value: 58.46
1229
+ - type: recall_at_1000
1230
+ value: 73.565
1231
+ - type: recall_at_3
1232
+ value: 33.005
1233
+ - type: recall_at_5
1234
+ value: 37.286
1235
+ - task:
1236
+ type: Classification
1237
+ dataset:
1238
+ type: mteb/imdb
1239
+ name: MTEB ImdbClassification
1240
+ config: default
1241
+ split: test
1242
+ metrics:
1243
+ - type: accuracy
1244
+ value: 70.7156
1245
+ - type: ap
1246
+ value: 64.89470531959896
1247
+ - type: f1
1248
+ value: 70.53051887683772
1249
+ - task:
1250
+ type: Retrieval
1251
+ dataset:
1252
+ type: msmarco
1253
+ name: MTEB MSMARCO
1254
+ config: default
1255
+ split: dev
1256
+ metrics:
1257
+ - type: map_at_1
1258
+ value: 21.174
1259
+ - type: map_at_10
1260
+ value: 33.0
1261
+ - type: map_at_100
1262
+ value: 34.178
1263
+ - type: map_at_1000
1264
+ value: 34.227000000000004
1265
+ - type: map_at_3
1266
+ value: 29.275000000000002
1267
+ - type: map_at_5
1268
+ value: 31.341
1269
+ - type: ndcg_at_1
1270
+ value: 21.776999999999997
1271
+ - type: ndcg_at_10
1272
+ value: 39.745999999999995
1273
+ - type: ndcg_at_100
1274
+ value: 45.488
1275
+ - type: ndcg_at_1000
1276
+ value: 46.733999999999995
1277
+ - type: ndcg_at_3
1278
+ value: 32.086
1279
+ - type: ndcg_at_5
1280
+ value: 35.792
1281
+ - type: precision_at_1
1282
+ value: 21.776999999999997
1283
+ - type: precision_at_10
1284
+ value: 6.324000000000001
1285
+ - type: precision_at_100
1286
+ value: 0.922
1287
+ - type: precision_at_1000
1288
+ value: 0.10300000000000001
1289
+ - type: precision_at_3
1290
+ value: 13.696
1291
+ - type: precision_at_5
1292
+ value: 10.100000000000001
1293
+ - type: recall_at_1
1294
+ value: 21.174
1295
+ - type: recall_at_10
1296
+ value: 60.488
1297
+ - type: recall_at_100
1298
+ value: 87.234
1299
+ - type: recall_at_1000
1300
+ value: 96.806
1301
+ - type: recall_at_3
1302
+ value: 39.582
1303
+ - type: recall_at_5
1304
+ value: 48.474000000000004
1305
+ - task:
1306
+ type: Classification
1307
+ dataset:
1308
+ type: mteb/mtop_domain
1309
+ name: MTEB MTOPDomainClassification (en)
1310
+ config: en
1311
+ split: test
1312
+ metrics:
1313
+ - type: accuracy
1314
+ value: 92.07934336525308
1315
+ - type: f1
1316
+ value: 91.93440027035814
1317
+ - task:
1318
+ type: Classification
1319
+ dataset:
1320
+ type: mteb/mtop_intent
1321
+ name: MTEB MTOPIntentClassification (en)
1322
+ config: en
1323
+ split: test
1324
+ metrics:
1325
+ - type: accuracy
1326
+ value: 70.20975832193344
1327
+ - type: f1
1328
+ value: 48.571776628850074
1329
+ - task:
1330
+ type: Classification
1331
+ dataset:
1332
+ type: mteb/amazon_massive_intent
1333
+ name: MTEB MassiveIntentClassification (en)
1334
+ config: en
1335
+ split: test
1336
+ metrics:
1337
+ - type: accuracy
1338
+ value: 69.56624075319435
1339
+ - type: f1
1340
+ value: 67.64419185784621
1341
+ - task:
1342
+ type: Classification
1343
+ dataset:
1344
+ type: mteb/amazon_massive_scenario
1345
+ name: MTEB MassiveScenarioClassification (en)
1346
+ config: en
1347
+ split: test
1348
+ metrics:
1349
+ - type: accuracy
1350
+ value: 76.01210490921318
1351
+ - type: f1
1352
+ value: 75.1934366365826
1353
+ - task:
1354
+ type: Clustering
1355
+ dataset:
1356
+ type: mteb/medrxiv-clustering-p2p
1357
+ name: MTEB MedrxivClusteringP2P
1358
+ config: default
1359
+ split: test
1360
+ metrics:
1361
+ - type: v_measure
1362
+ value: 35.58002813186373
1363
+ - task:
1364
+ type: Clustering
1365
+ dataset:
1366
+ type: mteb/medrxiv-clustering-s2s
1367
+ name: MTEB MedrxivClusteringS2S
1368
+ config: default
1369
+ split: test
1370
+ metrics:
1371
+ - type: v_measure
1372
+ value: 32.872725562410444
1373
+ - task:
1374
+ type: Reranking
1375
+ dataset:
1376
+ type: mteb/mind_small
1377
+ name: MTEB MindSmallReranking
1378
+ config: default
1379
+ split: test
1380
+ metrics:
1381
+ - type: map
1382
+ value: 30.965343604861328
1383
+ - type: mrr
1384
+ value: 31.933710165863594
1385
+ - task:
1386
+ type: Retrieval
1387
+ dataset:
1388
+ type: nfcorpus
1389
+ name: MTEB NFCorpus
1390
+ config: default
1391
+ split: test
1392
+ metrics:
1393
+ - type: map_at_1
1394
+ value: 4.938
1395
+ - type: map_at_10
1396
+ value: 12.034
1397
+ - type: map_at_100
1398
+ value: 15.675
1399
+ - type: map_at_1000
1400
+ value: 17.18
1401
+ - type: map_at_3
1402
+ value: 8.471
1403
+ - type: map_at_5
1404
+ value: 10.128
1405
+ - type: ndcg_at_1
1406
+ value: 40.402
1407
+ - type: ndcg_at_10
1408
+ value: 33.289
1409
+ - type: ndcg_at_100
1410
+ value: 31.496000000000002
1411
+ - type: ndcg_at_1000
1412
+ value: 40.453
1413
+ - type: ndcg_at_3
1414
+ value: 37.841
1415
+ - type: ndcg_at_5
1416
+ value: 36.215
1417
+ - type: precision_at_1
1418
+ value: 41.796
1419
+ - type: precision_at_10
1420
+ value: 25.294
1421
+ - type: precision_at_100
1422
+ value: 8.381
1423
+ - type: precision_at_1000
1424
+ value: 2.1260000000000003
1425
+ - type: precision_at_3
1426
+ value: 36.429
1427
+ - type: precision_at_5
1428
+ value: 32.446000000000005
1429
+ - type: recall_at_1
1430
+ value: 4.938
1431
+ - type: recall_at_10
1432
+ value: 16.637
1433
+ - type: recall_at_100
1434
+ value: 33.853
1435
+ - type: recall_at_1000
1436
+ value: 66.07
1437
+ - type: recall_at_3
1438
+ value: 9.818
1439
+ - type: recall_at_5
1440
+ value: 12.544
1441
+ - task:
1442
+ type: Retrieval
1443
+ dataset:
1444
+ type: nq
1445
+ name: MTEB NQ
1446
+ config: default
1447
+ split: test
1448
+ metrics:
1449
+ - type: map_at_1
1450
+ value: 27.124
1451
+ - type: map_at_10
1452
+ value: 42.418
1453
+ - type: map_at_100
1454
+ value: 43.633
1455
+ - type: map_at_1000
1456
+ value: 43.66
1457
+ - type: map_at_3
1458
+ value: 37.766
1459
+ - type: map_at_5
1460
+ value: 40.482
1461
+ - type: ndcg_at_1
1462
+ value: 30.794
1463
+ - type: ndcg_at_10
1464
+ value: 50.449999999999996
1465
+ - type: ndcg_at_100
1466
+ value: 55.437999999999995
1467
+ - type: ndcg_at_1000
1468
+ value: 56.084
1469
+ - type: ndcg_at_3
1470
+ value: 41.678
1471
+ - type: ndcg_at_5
1472
+ value: 46.257
1473
+ - type: precision_at_1
1474
+ value: 30.794
1475
+ - type: precision_at_10
1476
+ value: 8.656
1477
+ - type: precision_at_100
1478
+ value: 1.141
1479
+ - type: precision_at_1000
1480
+ value: 0.12
1481
+ - type: precision_at_3
1482
+ value: 19.37
1483
+ - type: precision_at_5
1484
+ value: 14.218
1485
+ - type: recall_at_1
1486
+ value: 27.124
1487
+ - type: recall_at_10
1488
+ value: 72.545
1489
+ - type: recall_at_100
1490
+ value: 93.938
1491
+ - type: recall_at_1000
1492
+ value: 98.788
1493
+ - type: recall_at_3
1494
+ value: 49.802
1495
+ - type: recall_at_5
1496
+ value: 60.426
1497
+ - task:
1498
+ type: Retrieval
1499
+ dataset:
1500
+ type: quora
1501
+ name: MTEB QuoraRetrieval
1502
+ config: default
1503
+ split: test
1504
+ metrics:
1505
+ - type: map_at_1
1506
+ value: 69.33500000000001
1507
+ - type: map_at_10
1508
+ value: 83.554
1509
+ - type: map_at_100
1510
+ value: 84.237
1511
+ - type: map_at_1000
1512
+ value: 84.251
1513
+ - type: map_at_3
1514
+ value: 80.456
1515
+ - type: map_at_5
1516
+ value: 82.395
1517
+ - type: ndcg_at_1
1518
+ value: 80.06
1519
+ - type: ndcg_at_10
1520
+ value: 87.46199999999999
1521
+ - type: ndcg_at_100
1522
+ value: 88.774
1523
+ - type: ndcg_at_1000
1524
+ value: 88.864
1525
+ - type: ndcg_at_3
1526
+ value: 84.437
1527
+ - type: ndcg_at_5
1528
+ value: 86.129
1529
+ - type: precision_at_1
1530
+ value: 80.06
1531
+ - type: precision_at_10
1532
+ value: 13.418
1533
+ - type: precision_at_100
1534
+ value: 1.536
1535
+ - type: precision_at_1000
1536
+ value: 0.157
1537
+ - type: precision_at_3
1538
+ value: 37.103
1539
+ - type: precision_at_5
1540
+ value: 24.522
1541
+ - type: recall_at_1
1542
+ value: 69.33500000000001
1543
+ - type: recall_at_10
1544
+ value: 95.03200000000001
1545
+ - type: recall_at_100
1546
+ value: 99.559
1547
+ - type: recall_at_1000
1548
+ value: 99.98700000000001
1549
+ - type: recall_at_3
1550
+ value: 86.404
1551
+ - type: recall_at_5
1552
+ value: 91.12400000000001
1553
+ - task:
1554
+ type: Clustering
1555
+ dataset:
1556
+ type: mteb/reddit-clustering
1557
+ name: MTEB RedditClustering
1558
+ config: default
1559
+ split: test
1560
+ metrics:
1561
+ - type: v_measure
1562
+ value: 54.824256698437324
1563
+ - task:
1564
+ type: Clustering
1565
+ dataset:
1566
+ type: mteb/reddit-clustering-p2p
1567
+ name: MTEB RedditClusteringP2P
1568
+ config: default
1569
+ split: test
1570
+ metrics:
1571
+ - type: v_measure
1572
+ value: 56.768972678049366
1573
+ - task:
1574
+ type: Retrieval
1575
+ dataset:
1576
+ type: scidocs
1577
+ name: MTEB SCIDOCS
1578
+ config: default
1579
+ split: test
1580
+ metrics:
1581
+ - type: map_at_1
1582
+ value: 5.192
1583
+ - type: map_at_10
1584
+ value: 14.426
1585
+ - type: map_at_100
1586
+ value: 17.18
1587
+ - type: map_at_1000
1588
+ value: 17.580000000000002
1589
+ - type: map_at_3
1590
+ value: 9.94
1591
+ - type: map_at_5
1592
+ value: 12.077
1593
+ - type: ndcg_at_1
1594
+ value: 25.5
1595
+ - type: ndcg_at_10
1596
+ value: 23.765
1597
+ - type: ndcg_at_100
1598
+ value: 33.664
1599
+ - type: ndcg_at_1000
1600
+ value: 39.481
1601
+ - type: ndcg_at_3
1602
+ value: 21.813
1603
+ - type: ndcg_at_5
1604
+ value: 19.285
1605
+ - type: precision_at_1
1606
+ value: 25.5
1607
+ - type: precision_at_10
1608
+ value: 12.690000000000001
1609
+ - type: precision_at_100
1610
+ value: 2.71
1611
+ - type: precision_at_1000
1612
+ value: 0.409
1613
+ - type: precision_at_3
1614
+ value: 20.732999999999997
1615
+ - type: precision_at_5
1616
+ value: 17.24
1617
+ - type: recall_at_1
1618
+ value: 5.192
1619
+ - type: recall_at_10
1620
+ value: 25.712000000000003
1621
+ - type: recall_at_100
1622
+ value: 54.99699999999999
1623
+ - type: recall_at_1000
1624
+ value: 82.97200000000001
1625
+ - type: recall_at_3
1626
+ value: 12.631999999999998
1627
+ - type: recall_at_5
1628
+ value: 17.497
1629
+ - task:
1630
+ type: STS
1631
+ dataset:
1632
+ type: mteb/sickr-sts
1633
+ name: MTEB SICK-R
1634
+ config: default
1635
+ split: test
1636
+ metrics:
1637
+ - type: cos_sim_pearson
1638
+ value: 84.00280838354293
1639
+ - type: cos_sim_spearman
1640
+ value: 80.5854192844009
1641
+ - type: euclidean_pearson
1642
+ value: 80.55974827073891
1643
+ - type: euclidean_spearman
1644
+ value: 80.58541460172292
1645
+ - type: manhattan_pearson
1646
+ value: 80.27294578437488
1647
+ - type: manhattan_spearman
1648
+ value: 80.33176193921884
1649
+ - task:
1650
+ type: STS
1651
+ dataset:
1652
+ type: mteb/sts12-sts
1653
+ name: MTEB STS12
1654
+ config: default
1655
+ split: test
1656
+ metrics:
1657
+ - type: cos_sim_pearson
1658
+ value: 83.2801353818369
1659
+ - type: cos_sim_spearman
1660
+ value: 72.63427853822449
1661
+ - type: euclidean_pearson
1662
+ value: 79.01343235899544
1663
+ - type: euclidean_spearman
1664
+ value: 72.63178302036903
1665
+ - type: manhattan_pearson
1666
+ value: 78.65899981586094
1667
+ - type: manhattan_spearman
1668
+ value: 72.26646573268035
1669
+ - task:
1670
+ type: STS
1671
+ dataset:
1672
+ type: mteb/sts13-sts
1673
+ name: MTEB STS13
1674
+ config: default
1675
+ split: test
1676
+ metrics:
1677
+ - type: cos_sim_pearson
1678
+ value: 83.20700572036095
1679
+ - type: cos_sim_spearman
1680
+ value: 83.48499016384896
1681
+ - type: euclidean_pearson
1682
+ value: 82.82555353364394
1683
+ - type: euclidean_spearman
1684
+ value: 83.48499008964005
1685
+ - type: manhattan_pearson
1686
+ value: 82.46034885462956
1687
+ - type: manhattan_spearman
1688
+ value: 83.09829447251937
1689
+ - task:
1690
+ type: STS
1691
+ dataset:
1692
+ type: mteb/sts14-sts
1693
+ name: MTEB STS14
1694
+ config: default
1695
+ split: test
1696
+ metrics:
1697
+ - type: cos_sim_pearson
1698
+ value: 82.27113025749529
1699
+ - type: cos_sim_spearman
1700
+ value: 78.0001371342168
1701
+ - type: euclidean_pearson
1702
+ value: 80.62651938409732
1703
+ - type: euclidean_spearman
1704
+ value: 78.0001341029446
1705
+ - type: manhattan_pearson
1706
+ value: 80.25786381999085
1707
+ - type: manhattan_spearman
1708
+ value: 77.68750207429126
1709
+ - task:
1710
+ type: STS
1711
+ dataset:
1712
+ type: mteb/sts15-sts
1713
+ name: MTEB STS15
1714
+ config: default
1715
+ split: test
1716
+ metrics:
1717
+ - type: cos_sim_pearson
1718
+ value: 84.98824030948605
1719
+ - type: cos_sim_spearman
1720
+ value: 85.66275391649481
1721
+ - type: euclidean_pearson
1722
+ value: 84.88733530073506
1723
+ - type: euclidean_spearman
1724
+ value: 85.66275062257034
1725
+ - type: manhattan_pearson
1726
+ value: 84.70100813924223
1727
+ - type: manhattan_spearman
1728
+ value: 85.50318526944764
1729
+ - task:
1730
+ type: STS
1731
+ dataset:
1732
+ type: mteb/sts16-sts
1733
+ name: MTEB STS16
1734
+ config: default
1735
+ split: test
1736
+ metrics:
1737
+ - type: cos_sim_pearson
1738
+ value: 78.82478639193744
1739
+ - type: cos_sim_spearman
1740
+ value: 80.03011315645662
1741
+ - type: euclidean_pearson
1742
+ value: 79.84794502236802
1743
+ - type: euclidean_spearman
1744
+ value: 80.03011258077692
1745
+ - type: manhattan_pearson
1746
+ value: 79.47012152325492
1747
+ - type: manhattan_spearman
1748
+ value: 79.60652985087651
1749
+ - task:
1750
+ type: STS
1751
+ dataset:
1752
+ type: mteb/sts17-crosslingual-sts
1753
+ name: MTEB STS17 (en-en)
1754
+ config: en-en
1755
+ split: test
1756
+ metrics:
1757
+ - type: cos_sim_pearson
1758
+ value: 90.90804154377126
1759
+ - type: cos_sim_spearman
1760
+ value: 90.59523263123734
1761
+ - type: euclidean_pearson
1762
+ value: 89.8466957775513
1763
+ - type: euclidean_spearman
1764
+ value: 90.59523263123734
1765
+ - type: manhattan_pearson
1766
+ value: 89.82268413033941
1767
+ - type: manhattan_spearman
1768
+ value: 90.68706496728889
1769
+ - task:
1770
+ type: STS
1771
+ dataset:
1772
+ type: mteb/sts22-crosslingual-sts
1773
+ name: MTEB STS22 (en)
1774
+ config: en
1775
+ split: test
1776
+ metrics:
1777
+ - type: cos_sim_pearson
1778
+ value: 66.78771571400975
1779
+ - type: cos_sim_spearman
1780
+ value: 67.94534221542501
1781
+ - type: euclidean_pearson
1782
+ value: 68.62534447097993
1783
+ - type: euclidean_spearman
1784
+ value: 67.94534221542501
1785
+ - type: manhattan_pearson
1786
+ value: 68.35916011329631
1787
+ - type: manhattan_spearman
1788
+ value: 67.58212723406085
1789
+ - task:
1790
+ type: STS
1791
+ dataset:
1792
+ type: mteb/stsbenchmark-sts
1793
+ name: MTEB STSBenchmark
1794
+ config: default
1795
+ split: test
1796
+ metrics:
1797
+ - type: cos_sim_pearson
1798
+ value: 84.03996099800993
1799
+ - type: cos_sim_spearman
1800
+ value: 83.421898505618
1801
+ - type: euclidean_pearson
1802
+ value: 83.78671249317563
1803
+ - type: euclidean_spearman
1804
+ value: 83.4219042133061
1805
+ - type: manhattan_pearson
1806
+ value: 83.44085827249334
1807
+ - type: manhattan_spearman
1808
+ value: 83.02901331535297
1809
+ - task:
1810
+ type: Reranking
1811
+ dataset:
1812
+ type: mteb/scidocs-reranking
1813
+ name: MTEB SciDocsRR
1814
+ config: default
1815
+ split: test
1816
+ metrics:
1817
+ - type: map
1818
+ value: 88.65396986895777
1819
+ - type: mrr
1820
+ value: 96.60209525405604
1821
+ - task:
1822
+ type: Retrieval
1823
+ dataset:
1824
+ type: scifact
1825
+ name: MTEB SciFact
1826
+ config: default
1827
+ split: test
1828
+ metrics:
1829
+ - type: map_at_1
1830
+ value: 51.456
1831
+ - type: map_at_10
1832
+ value: 60.827
1833
+ - type: map_at_100
1834
+ value: 61.595
1835
+ - type: map_at_1000
1836
+ value: 61.629999999999995
1837
+ - type: map_at_3
1838
+ value: 57.518
1839
+ - type: map_at_5
1840
+ value: 59.435
1841
+ - type: ndcg_at_1
1842
+ value: 53.333
1843
+ - type: ndcg_at_10
1844
+ value: 65.57
1845
+ - type: ndcg_at_100
1846
+ value: 68.911
1847
+ - type: ndcg_at_1000
1848
+ value: 69.65299999999999
1849
+ - type: ndcg_at_3
1850
+ value: 60.009
1851
+ - type: ndcg_at_5
1852
+ value: 62.803
1853
+ - type: precision_at_1
1854
+ value: 53.333
1855
+ - type: precision_at_10
1856
+ value: 8.933
1857
+ - type: precision_at_100
1858
+ value: 1.0699999999999998
1859
+ - type: precision_at_1000
1860
+ value: 0.11299999999999999
1861
+ - type: precision_at_3
1862
+ value: 23.333000000000002
1863
+ - type: precision_at_5
1864
+ value: 15.8
1865
+ - type: recall_at_1
1866
+ value: 51.456
1867
+ - type: recall_at_10
1868
+ value: 79.011
1869
+ - type: recall_at_100
1870
+ value: 94.167
1871
+ - type: recall_at_1000
1872
+ value: 99.667
1873
+ - type: recall_at_3
1874
+ value: 64.506
1875
+ - type: recall_at_5
1876
+ value: 71.211
1877
+ - task:
1878
+ type: PairClassification
1879
+ dataset:
1880
+ type: mteb/sprintduplicatequestions-pairclassification
1881
+ name: MTEB SprintDuplicateQuestions
1882
+ config: default
1883
+ split: test
1884
+ metrics:
1885
+ - type: cos_sim_accuracy
1886
+ value: 99.65940594059406
1887
+ - type: cos_sim_ap
1888
+ value: 90.1455141683116
1889
+ - type: cos_sim_f1
1890
+ value: 82.26044226044226
1891
+ - type: cos_sim_precision
1892
+ value: 80.8695652173913
1893
+ - type: cos_sim_recall
1894
+ value: 83.7
1895
+ - type: dot_accuracy
1896
+ value: 99.65940594059406
1897
+ - type: dot_ap
1898
+ value: 90.1455141683116
1899
+ - type: dot_f1
1900
+ value: 82.26044226044226
1901
+ - type: dot_precision
1902
+ value: 80.8695652173913
1903
+ - type: dot_recall
1904
+ value: 83.7
1905
+ - type: euclidean_accuracy
1906
+ value: 99.65940594059406
1907
+ - type: euclidean_ap
1908
+ value: 90.14551416831162
1909
+ - type: euclidean_f1
1910
+ value: 82.26044226044226
1911
+ - type: euclidean_precision
1912
+ value: 80.8695652173913
1913
+ - type: euclidean_recall
1914
+ value: 83.7
1915
+ - type: manhattan_accuracy
1916
+ value: 99.64950495049504
1917
+ - type: manhattan_ap
1918
+ value: 89.5492617367771
1919
+ - type: manhattan_f1
1920
+ value: 81.58280410356619
1921
+ - type: manhattan_precision
1922
+ value: 79.75167144221585
1923
+ - type: manhattan_recall
1924
+ value: 83.5
1925
+ - type: max_accuracy
1926
+ value: 99.65940594059406
1927
+ - type: max_ap
1928
+ value: 90.14551416831162
1929
+ - type: max_f1
1930
+ value: 82.26044226044226
1931
+ - task:
1932
+ type: Clustering
1933
+ dataset:
1934
+ type: mteb/stackexchange-clustering
1935
+ name: MTEB StackExchangeClustering
1936
+ config: default
1937
+ split: test
1938
+ metrics:
1939
+ - type: v_measure
1940
+ value: 53.80048409076929
1941
+ - task:
1942
+ type: Clustering
1943
+ dataset:
1944
+ type: mteb/stackexchange-clustering-p2p
1945
+ name: MTEB StackExchangeClusteringP2P
1946
+ config: default
1947
+ split: test
1948
+ metrics:
1949
+ - type: v_measure
1950
+ value: 34.280269334397545
1951
+ - task:
1952
+ type: Reranking
1953
+ dataset:
1954
+ type: mteb/stackoverflowdupquestions-reranking
1955
+ name: MTEB StackOverflowDupQuestions
1956
+ config: default
1957
+ split: test
1958
+ metrics:
1959
+ - type: map
1960
+ value: 51.97907654945493
1961
+ - type: mrr
1962
+ value: 52.82873376623376
1963
+ - task:
1964
+ type: Summarization
1965
+ dataset:
1966
+ type: mteb/summeval
1967
+ name: MTEB SummEval
1968
+ config: default
1969
+ split: test
1970
+ metrics:
1971
+ - type: cos_sim_pearson
1972
+ value: 28.364293841556304
1973
+ - type: cos_sim_spearman
1974
+ value: 27.485869639926136
1975
+ - type: dot_pearson
1976
+ value: 28.364295910221145
1977
+ - type: dot_spearman
1978
+ value: 27.485869639926136
1979
+ - task:
1980
+ type: Retrieval
1981
+ dataset:
1982
+ type: trec-covid
1983
+ name: MTEB TRECCOVID
1984
+ config: default
1985
+ split: test
1986
+ metrics:
1987
+ - type: map_at_1
1988
+ value: 0.19499999999999998
1989
+ - type: map_at_10
1990
+ value: 1.218
1991
+ - type: map_at_100
1992
+ value: 7.061000000000001
1993
+ - type: map_at_1000
1994
+ value: 19.735
1995
+ - type: map_at_3
1996
+ value: 0.46499999999999997
1997
+ - type: map_at_5
1998
+ value: 0.672
1999
+ - type: ndcg_at_1
2000
+ value: 60.0
2001
+ - type: ndcg_at_10
2002
+ value: 51.32600000000001
2003
+ - type: ndcg_at_100
2004
+ value: 41.74
2005
+ - type: ndcg_at_1000
2006
+ value: 43.221
2007
+ - type: ndcg_at_3
2008
+ value: 54.989
2009
+ - type: ndcg_at_5
2010
+ value: 52.905
2011
+ - type: precision_at_1
2012
+ value: 66.0
2013
+ - type: precision_at_10
2014
+ value: 55.60000000000001
2015
+ - type: precision_at_100
2016
+ value: 43.34
2017
+ - type: precision_at_1000
2018
+ value: 19.994
2019
+ - type: precision_at_3
2020
+ value: 59.333000000000006
2021
+ - type: precision_at_5
2022
+ value: 57.199999999999996
2023
+ - type: recall_at_1
2024
+ value: 0.19499999999999998
2025
+ - type: recall_at_10
2026
+ value: 1.473
2027
+ - type: recall_at_100
2028
+ value: 10.596
2029
+ - type: recall_at_1000
2030
+ value: 42.466
2031
+ - type: recall_at_3
2032
+ value: 0.49899999999999994
2033
+ - type: recall_at_5
2034
+ value: 0.76
2035
+ - task:
2036
+ type: Retrieval
2037
+ dataset:
2038
+ type: webis-touche2020
2039
+ name: MTEB Touche2020
2040
+ config: default
2041
+ split: test
2042
+ metrics:
2043
+ - type: map_at_1
2044
+ value: 1.997
2045
+ - type: map_at_10
2046
+ value: 7.5569999999999995
2047
+ - type: map_at_100
2048
+ value: 12.238
2049
+ - type: map_at_1000
2050
+ value: 13.773
2051
+ - type: map_at_3
2052
+ value: 4.334
2053
+ - type: map_at_5
2054
+ value: 5.5
2055
+ - type: ndcg_at_1
2056
+ value: 22.448999999999998
2057
+ - type: ndcg_at_10
2058
+ value: 19.933999999999997
2059
+ - type: ndcg_at_100
2060
+ value: 30.525999999999996
2061
+ - type: ndcg_at_1000
2062
+ value: 43.147999999999996
2063
+ - type: ndcg_at_3
2064
+ value: 22.283
2065
+ - type: ndcg_at_5
2066
+ value: 21.224
2067
+ - type: precision_at_1
2068
+ value: 24.490000000000002
2069
+ - type: precision_at_10
2070
+ value: 17.551
2071
+ - type: precision_at_100
2072
+ value: 6.4079999999999995
2073
+ - type: precision_at_1000
2074
+ value: 1.463
2075
+ - type: precision_at_3
2076
+ value: 23.128999999999998
2077
+ - type: precision_at_5
2078
+ value: 20.816000000000003
2079
+ - type: recall_at_1
2080
+ value: 1.997
2081
+ - type: recall_at_10
2082
+ value: 13.001999999999999
2083
+ - type: recall_at_100
2084
+ value: 40.98
2085
+ - type: recall_at_1000
2086
+ value: 79.40899999999999
2087
+ - type: recall_at_3
2088
+ value: 5.380999999999999
2089
+ - type: recall_at_5
2090
+ value: 7.721
2091
+ - task:
2092
+ type: Classification
2093
+ dataset:
2094
+ type: mteb/toxic_conversations_50k
2095
+ name: MTEB ToxicConversationsClassification
2096
+ config: default
2097
+ split: test
2098
+ metrics:
2099
+ - type: accuracy
2100
+ value: 60.861200000000004
2101
+ - type: ap
2102
+ value: 11.39641747026629
2103
+ - type: f1
2104
+ value: 47.80230380517065
2105
+ - task:
2106
+ type: Classification
2107
+ dataset:
2108
+ type: mteb/tweet_sentiment_extraction
2109
+ name: MTEB TweetSentimentExtractionClassification
2110
+ config: default
2111
+ split: test
2112
+ metrics:
2113
+ - type: accuracy
2114
+ value: 55.464063384267114
2115
+ - type: f1
2116
+ value: 55.759039643764666
2117
+ - task:
2118
+ type: Clustering
2119
+ dataset:
2120
+ type: mteb/twentynewsgroups-clustering
2121
+ name: MTEB TwentyNewsgroupsClustering
2122
+ config: default
2123
+ split: test
2124
+ metrics:
2125
+ - type: v_measure
2126
+ value: 49.74455348083809
2127
+ - task:
2128
+ type: PairClassification
2129
+ dataset:
2130
+ type: mteb/twittersemeval2015-pairclassification
2131
+ name: MTEB TwitterSemEval2015
2132
+ config: default
2133
+ split: test
2134
+ metrics:
2135
+ - type: cos_sim_accuracy
2136
+ value: 86.07617571675507
2137
+ - type: cos_sim_ap
2138
+ value: 73.85398650568216
2139
+ - type: cos_sim_f1
2140
+ value: 68.50702798531087
2141
+ - type: cos_sim_precision
2142
+ value: 65.86316045775506
2143
+ - type: cos_sim_recall
2144
+ value: 71.37203166226914
2145
+ - type: dot_accuracy
2146
+ value: 86.07617571675507
2147
+ - type: dot_ap
2148
+ value: 73.85398346238429
2149
+ - type: dot_f1
2150
+ value: 68.50702798531087
2151
+ - type: dot_precision
2152
+ value: 65.86316045775506
2153
+ - type: dot_recall
2154
+ value: 71.37203166226914
2155
+ - type: euclidean_accuracy
2156
+ value: 86.07617571675507
2157
+ - type: euclidean_ap
2158
+ value: 73.85398625060357
2159
+ - type: euclidean_f1
2160
+ value: 68.50702798531087
2161
+ - type: euclidean_precision
2162
+ value: 65.86316045775506
2163
+ - type: euclidean_recall
2164
+ value: 71.37203166226914
2165
+ - type: manhattan_accuracy
2166
+ value: 85.98676759849795
2167
+ - type: manhattan_ap
2168
+ value: 73.86874126878737
2169
+ - type: manhattan_f1
2170
+ value: 68.55096559662361
2171
+ - type: manhattan_precision
2172
+ value: 66.51774633904195
2173
+ - type: manhattan_recall
2174
+ value: 70.71240105540898
2175
+ - type: max_accuracy
2176
+ value: 86.07617571675507
2177
+ - type: max_ap
2178
+ value: 73.86874126878737
2179
+ - type: max_f1
2180
+ value: 68.55096559662361
2181
+ - task:
2182
+ type: PairClassification
2183
+ dataset:
2184
+ type: mteb/twitterurlcorpus-pairclassification
2185
+ name: MTEB TwitterURLCorpus
2186
+ config: default
2187
+ split: test
2188
+ metrics:
2189
+ - type: cos_sim_accuracy
2190
+ value: 88.51631932316529
2191
+ - type: cos_sim_ap
2192
+ value: 85.10831084479727
2193
+ - type: cos_sim_f1
2194
+ value: 77.14563397129186
2195
+ - type: cos_sim_precision
2196
+ value: 74.9709386806161
2197
+ - type: cos_sim_recall
2198
+ value: 79.45026178010471
2199
+ - type: dot_accuracy
2200
+ value: 88.51631932316529
2201
+ - type: dot_ap
2202
+ value: 85.10831188797107
2203
+ - type: dot_f1
2204
+ value: 77.14563397129186
2205
+ - type: dot_precision
2206
+ value: 74.9709386806161
2207
+ - type: dot_recall
2208
+ value: 79.45026178010471
2209
+ - type: euclidean_accuracy
2210
+ value: 88.51631932316529
2211
+ - type: euclidean_ap
2212
+ value: 85.10829618408616
2213
+ - type: euclidean_f1
2214
+ value: 77.14563397129186
2215
+ - type: euclidean_precision
2216
+ value: 74.9709386806161
2217
+ - type: euclidean_recall
2218
+ value: 79.45026178010471
2219
+ - type: manhattan_accuracy
2220
+ value: 88.50467652423643
2221
+ - type: manhattan_ap
2222
+ value: 85.08329502055064
2223
+ - type: manhattan_f1
2224
+ value: 77.11157455683002
2225
+ - type: manhattan_precision
2226
+ value: 74.67541834968263
2227
+ - type: manhattan_recall
2228
+ value: 79.71204188481676
2229
+ - type: max_accuracy
2230
+ value: 88.51631932316529
2231
+ - type: max_ap
2232
+ value: 85.10831188797107
2233
+ - type: max_f1
2234
+ value: 77.14563397129186
2235
  ---
2236
 
2237