izhx committed
Commit b78e63e (1 parent: 929b134)

Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -2614,7 +2614,7 @@ The models are built upon the `transformer++` encoder [backbone](https://hugging
 The `gte-v1.5` series achieve state-of-the-art scores on the MTEB benchmark within the same model size category and prodvide competitive on the LoCo long-context retrieval tests (refer to [Evaluation](#evaluation)).
 
 We also present the [`gte-Qwen1.5-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct),
-a SOTA instruction-tuned bilingual embedding model that ranked 2nd in MTEB and 1st in C-MTEB.
+a SOTA instruction-tuned multi-lingual embedding model that ranked 2nd in MTEB and 1st in C-MTEB.
 
 <!-- Provide a longer summary of what this model is. -->
 
@@ -2628,7 +2628,7 @@ a SOTA instruction-tuned bilingual embedding model that ranked 2nd in MTEB and 1
 
 | Models | Language | Model Size | Max Seq. Length | Dimension | MTEB-en | LoCo |
 |:-----: | :-----: |:-----: |:-----: |:-----: | :-----: | :-----: |
-|[`gte-Qwen1.5-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct)| Chinese, English | 7720 | 32768 | 4096 | 67.34 | 87.57 |
+|[`gte-Qwen1.5-7B-instruct`](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct)| Multiple | 7720 | 32768 | 4096 | 67.34 | 87.57 |
 |[`gte-large-en-v1.5`](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | English | 434 | 8192 | 1024 | 65.39 | 86.71 |
 |[`gte-base-en-v1.5`](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | English | 137 | 8192 | 768 | 64.11 | 87.44 |
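
For context on the models listed in the diff above, here is a minimal sketch of how one of them might be used to compute embeddings. It assumes the `sentence-transformers` package and picks `gte-base-en-v1.5` from the table; the example texts and the `normalize_embeddings` choice are illustrative assumptions, not part of this commit.

```python
# Minimal sketch (assumption): embedding two texts with gte-base-en-v1.5 via
# sentence-transformers. The gte-v1.5 models ship custom modeling code, so
# trust_remote_code=True is expected to be required.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Alibaba-NLP/gte-base-en-v1.5", trust_remote_code=True)

texts = [
    "what is the capital of China?",
    "Beijing is the capital of China.",
]

# Encode to 768-dimensional vectors (dimension per the table above).
embeddings = model.encode(texts, normalize_embeddings=True)

# Cosine similarity between the two texts.
print(util.cos_sim(embeddings[0], embeddings[1]))
```

The instruction-tuned `gte-Qwen1.5-7B-instruct` model typically expects a task instruction prepended to queries; consult its model card for the exact prompt format.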