SentenceTransformer based on nomic-ai/nomic-embed-text-v1.5

This is a sentence-transformers model finetuned from nomic-ai/nomic-embed-text-v1.5 on the triplets and pairs datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/nomic-embed-text-v1.5
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets:
    • triplets
    • pairs

Model Sources

  • Documentation: Sentence Transformers Documentation
  • Repository: Sentence Transformers on GitHub
  • Hugging Face: Sentence Transformers on Hugging Face

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
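
The pooling block above is plain mean pooling over token embeddings. As a minimal sketch of what the stack computes, assuming the base tokenizer and the NomicBERT weights load with trust_remote_code (only the pooling math is specified by this card):

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nomic-ai/nomic-embed-text-v1.5")
model = AutoModel.from_pretrained("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

encoded = tokenizer(["search_query: makeup"], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, ignoring padding positions
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)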

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library; the custom NomicBERT modeling code this model loads also needs einops:

pip install -U sentence-transformers einops

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)  # NomicBERT ships custom modeling code
# Run inference
sentences = [
    'search_query: makeup',
    'search_query: make up',
    'search_query: hyundai tucson rims',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
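
Note that the example sentences carry the base model's task prefixes (search_query: for queries, search_document: for documents); the training samples below use the same convention, so retrieval inputs should too. A minimal search sketch, with illustrative product titles shortened from the samples in this card:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)

query = "search_query: wireless keyboard without number pad"
documents = [
    "search_document: Macally 2.4G Small Wireless Keyboard, Macally, Black",
    "search_document: Wireless Keyboard - iClever GKA22S Rechargeable Keyboard with Number Pad, iClever, Silver",
]

query_embedding = model.encode(query)
document_embeddings = model.encode(documents)
scores = model.similarity(query_embedding, document_embeddings)  # shape (1, 2)
print(documents[scores.argmax().item()])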

Evaluation

Metrics

Triplet

  • cosine_accuracy: 0.724
  • dot_accuracy: 0.2849
  • manhattan_accuracy: 0.7206
  • euclidean_accuracy: 0.7224
  • max_accuracy: 0.724
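
These numbers come from a triplet evaluation: accuracy is the fraction of triplets where the anchor lands closer to the positive than to the negative under each distance. A hedged sketch using Sentence Transformers' TripletEvaluator, with a toy triplet standing in for the 10,000-sample evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)

evaluator = TripletEvaluator(
    anchors=["search_query: hitch fifth wheel"],
    positives=["search_document: ENIXWILL 5th Wheel Trailer Hitch Lifting Device Bracket Pin, ENIXWILL, Black"],
    negatives=["search_document: ECOTRIC Fifth 5th Wheel Trailer Hitch Mount Rails, ECOTRIC, black"],
    name="triplets-dev",  # hypothetical evaluator name
)
results = evaluator(model)  # e.g. {"triplets-dev_cosine_accuracy": ...}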

Semantic Similarity

  • pearson_cosine: 0.5105
  • spearman_cosine: 0.4916
  • pearson_manhattan: 0.4556
  • spearman_manhattan: 0.4447
  • pearson_euclidean: 0.4572
  • spearman_euclidean: 0.4466
  • pearson_dot: 0.4938
  • spearman_dot: 0.4819
  • pearson_max: 0.5105
  • spearman_max: 0.4916
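
These are Pearson and Spearman correlations between model similarities and gold scores on the scored pairs. A hedged sketch with the EmbeddingSimilarityEvaluator, using the three evaluation samples shown later in this card (abbreviated):

from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["outdoor ceiling fans without light", "bathroom cabinet", "fitbit charge 3"],
    sentences2=[
        '44" Plaza Industrial Indoor Outdoor Ceiling Fan with Remote Control, Casa Vieja, No Light Kit - Bronze',
        "Homfa Bathroom Floor Cabinet Free Standing with Single Door, Homfa, White",
        "TreasureMax Compatible with Fitbit Charge 2 Bands for Women/Men, TreasureMax, Paw 2",
    ],
    scores=[1.0, 1.0, 0.4],
    name="pairs-dev",  # hypothetical evaluator name
)
results = evaluator(model)  # reuses the model loaded above; e.g. {"pairs-dev_spearman_cosine": ...}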

Training Details

Training Datasets

triplets

  • Dataset: triplets
  • Size: 684,084 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min 7 / mean 11.1 / max 22 tokens
    • positive: string, min 17 / mean 42.75 / max 95 tokens
    • negative: string, min 15 / mean 43.8 / max 127 tokens
  • Samples:
    • anchor: search_query: tarps heavy duty waterproof 8x10
      positive: search_document: 8' x 10' Super Heavy Duty 16 Mil Brown Poly Tarp Cover - Thick Waterproof, UV Resistant, Rip and Tear Proof Tarpaulin with Grommets and Reinforced Edges - by Xpose Safety, Xpose Safety, Brown
      negative: search_document: Grillkid 6'X8' 4.5 Mil Thick General Purpose Waterproof Poly Tarp, Grillkid, All Purpose
    • anchor: search_query: wireless keyboard without number pad
      positive: search_document: Macally 2.4G Small Wireless Keyboard - Ergonomic & Comfortable Computer Keyboard - Compact Keyboard for Laptop or Windows PC Desktop, Tablet, Smart TV - Plug & Play Mini Keyboard with 12 Hot Keys, Macally, Black
      negative: search_document: Wireless Keyboard - iClever GKA22S Rechargeable Keyboard with Number Pad, Full-Size Stainless Steel Ultra Slim Keyboard, 2.4G Stable Connection Wireless Keyboard for iMac, Mackbook, PC, Laptop, iClever, Silver
    • anchor: search_query: geometry earrings
      positive: search_document: Simple Stud Earrings for Women, Geometric Minimalist Stud Earring Set Tiny Circle Triangle Square Bar Stud Earrings Mini Cartilage Tragus Earrings, choice of all, B:Circle Sliver
      negative: search_document: BONALUNA Bohemian Wood And Marble Effect Oblong Shaped Drop Statement Earrings (VIVID TURQUOISE), BONALUNA, VIVID TURQUOISE
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
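
A sketch of constructing this loss for a loaded SentenceTransformer; cosine similarity with scale 20.0 matches the parameters above, while mini_batch_size is an assumption (it only chunks the forward passes for gradient caching and does not change the effective batch):

from sentence_transformers import losses

triplet_loss = losses.CachedMultipleNegativesRankingLoss(
    model,                # the SentenceTransformer being trained
    scale=20.0,           # temperature from the parameters above
    mini_batch_size=32,   # assumption: memory/speed trade-off only
)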
    

pairs

  • Dataset: pairs
  • Size: 498,114 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string, min 3 / mean 6.73 / max 33 tokens
    • sentence2: string, min 10 / mean 40.14 / max 98 tokens
    • score: float, min 0.0 / mean 0.81 / max 1.0
  • Samples:
    • sentence1: I would choose a medium weight waterproof fabric, hip length jacket or longer, long sleeves, zip front, with a hood and deep pockets with zips
      sentence2: ZSHOW Men's Winter Hooded Packable Down Jacket(Blue, XX-Large), ZSHOW, Blue
      score: 1.0
    • sentence1: sequin dance costume girls
      sentence2: Yeahdor Big Girls' Lyrical Latin Ballet Dance Costumes Dresses Halter Sequins Irregular Tutu Skirted Leotard Dancewear Pink 12-14, Yeahdor, Pink
      score: 1.0
    • sentence1: paint easel bulk
      sentence2: Artecho Artist Easel Display Easel Stand, 2 Pack Metal Tripod Stand Easel for Painting, Hold Canvas from 21" to 66", Floor and Tabletop Displaying, Painting with Portable Bag, Artecho, Black
      score: 1.0
  • Loss: AnglELoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_angle_sim"
    }
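
AnglELoss is a CoSENT-style pairwise loss that optimizes angle differences in complex space, which is why its similarity function is pairwise_angle_sim rather than plain cosine. A minimal construction sketch matching the parameters above:

from sentence_transformers import losses

pair_loss = losses.AnglELoss(model, scale=20.0)  # expects (sentence1, sentence2, score) rows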
    

Evaluation Datasets

triplets

  • Dataset: triplets
  • Size: 10,000 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min 7 / mean 11.13 / max 23 tokens
    • positive: string, min 15 / mean 43.11 / max 107 tokens
    • negative: string, min 15 / mean 43.56 / max 99 tokens
  • Samples:
    • anchor: search_query: hitch fifth wheel
      positive: search_document: ENIXWILL 5th Wheel Trailer Hitch Lifting Device Bracket Pin Fit for Hitch Companion and Patriot Series Hitch, ENIXWILL, Black
      negative: search_document: ECOTRIC Fifth 5th Wheel Trailer Hitch Mount Rails and Installation Kits for Full-Size Trucks, ECOTRIC, black
    • anchor: search_query: dek pro
      positive: search_document: Cubiker Computer Desk 47 inch Home Office Writing Study Desk, Modern Simple Style Laptop Table with Storage Bag, Brown, Cubiker, Brown
      negative: search_document: FEZIBO Dual Motor L Shaped Electric Standing Desk, 48 Inches Stand Up Corner Desk, Home Office Sit Stand Desk with Rustic Brown Top and Black Frame, FEZIBO, Rustic Brown
    • anchor: search_query: 1 year baby mouth without teeth cleaner
      positive: search_document: Baby Toothbrush,Infant Toothbrush,Baby Tongue Cleaner,Infant Toothbrush,Baby Tongue Cleaner Newborn,Toothbrush Tongue Cleaner Dental Care for 0-36 Month Baby,36 Pcs + Free 4 Pcs, Babycolor, Blue
      negative: search_document: Slotic Baby Toothbrush for 0-2 Years, Safe and Sturdy, Toddler Oral Care Teether Brush, Extra Soft Bristle for Baby Teeth and Infant Gums, Dentist Recommended (4-Pack), Slotic, 4 Pack
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

pairs

  • Dataset: pairs
  • Size: 10,000 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string, min 3 / mean 6.8 / max 34 tokens
    • sentence2: string, min 9 / mean 39.7 / max 101 tokens
    • score: float, min 0.0 / mean 0.77 / max 1.0
  • Samples:
    • sentence1: outdoor ceiling fans without light
      sentence2: 44" Plaza Industrial Indoor Outdoor Ceiling Fan with Remote Control Oil Rubbed Bronze Damp Rated for Patio Porch - Casa Vieja, Casa Vieja, No Light Kit - Bronze
      score: 1.0
    • sentence1: bathroom cabinet
      sentence2: Homfa Bathroom Floor Cabinet Free Standing with Single Door Multifunctional Bathroom Storage Organizer Toiletries(Ivory White), Homfa, White
      score: 1.0
    • sentence1: fitbit charge 3
      sentence2: TreasureMax Compatible with Fitbit Charge 2 Bands for Women/Men,Silicone Fadeless Pattern Printed Replacement Floral Bands for Fitbit Charge 2 HR Wristbands, TreasureMax, Paw 2
      score: 0.4
  • Loss: AnglELoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_angle_sim"
    }
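
Putting the pieces together, a hedged sketch of multi-dataset training in Sentence Transformers 3.0: each named dataset is paired with its own loss. The toy in-memory datasets stand in for the real triplets and pairs splits, which this card does not publish; training arguments (next section) are omitted here for brevity:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

triplets = Dataset.from_dict({
    "anchor": ["search_query: geometry earrings"],
    "positive": ["search_document: Simple Stud Earrings for Women, choice of all, B:Circle Sliver"],
    "negative": ["search_document: BONALUNA Bohemian Wood And Marble Effect Earrings, BONALUNA, VIVID TURQUOISE"],
})
pairs = Dataset.from_dict({
    "sentence1": ["bathroom cabinet"],
    "sentence2": ["Homfa Bathroom Floor Cabinet Free Standing with Single Door, Homfa, White"],
    "score": [1.0],
})

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"triplets": triplets, "pairs": pairs},  # names match the loss dict below
    loss={
        "triplets": losses.CachedMultipleNegativesRankingLoss(model, scale=20.0),
        "pairs": losses.AnglELoss(model, scale=20.0),
    },
)
trainer.train()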
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 4
  • learning_rate: 1e-06
  • num_train_epochs: 5
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {'num_cycles': 4}
  • warmup_ratio: 0.01
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • load_best_model_at_end: True
  • batch_sampler: no_duplicates
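
For reference, a sketch of how these non-default values map onto SentenceTransformerTrainingArguments; output_dir and the steps-based eval/save cadence are assumptions (the 1000-step interval is inferred from the training logs below):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",                      # assumption
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=1e-6,
    num_train_epochs=5,
    lr_scheduler_type="cosine_with_restarts",
    lr_scheduler_kwargs={"num_cycles": 4},
    warmup_ratio=0.01,
    dataloader_drop_last=True,
    dataloader_num_workers=4,
    dataloader_prefetch_factor=2,
    load_best_model_at_end=True,
    evaluation_strategy="steps",              # assumption, inferred from the logs
    eval_steps=1000,
    save_strategy="steps",
    save_steps=1000,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)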

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • learning_rate: 1e-06
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {'num_cycles': 4}
  • warmup_ratio: 0.01
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss pairs loss triplets loss cosine_accuracy spearman_cosine
0.0014 100 0.8207 - - - -
0.0027 200 0.9003 - - - -
0.0041 300 0.8379 - - - -
0.0054 400 0.815 - - - -
0.0068 500 0.8981 - - - -
0.0081 600 0.9957 - - - -
0.0095 700 0.8284 - - - -
0.0108 800 0.8095 - - - -
0.0122 900 0.9307 - - - -
0.0135 1000 0.9906 1.3590 0.6927 0.694 0.3576
0.0149 1100 0.8519 - - - -
0.0162 1200 0.738 - - - -
0.0176 1300 0.9221 - - - -
0.0189 1400 0.8652 - - - -
0.0203 1500 0.8599 - - - -
0.0217 1600 0.8376 - - - -
0.0230 1700 0.8015 - - - -
0.0244 1800 0.8402 - - - -
0.0257 1900 0.8278 - - - -
0.0271 2000 0.9169 1.2825 0.6685 0.6984 0.3827
0.0284 2100 0.8237 - - - -
0.0298 2200 0.6999 - - - -
0.0311 2300 0.8482 - - - -
0.0325 2400 0.7317 - - - -
0.0338 2500 0.8562 - - - -
0.0352 2600 0.7919 - - - -
0.0365 2700 0.8009 - - - -
0.0379 2800 0.7552 - - - -
0.0392 2900 0.8148 - - - -
0.0406 3000 0.7556 1.2029 0.6480 0.7064 0.4045
0.0420 3100 0.6813 - - - -
0.0433 3200 0.7406 - - - -
0.0447 3300 0.8198 - - - -
0.0460 3400 0.7842 - - - -
0.0474 3500 0.74 - - - -
0.0487 3600 0.7117 - - - -
0.0501 3700 0.7404 - - - -
0.0514 3800 0.6719 - - - -
0.0528 3900 0.6728 - - - -
0.0541 4000 0.7189 1.0997 0.6337 0.7146 0.4284
0.0555 4100 0.7812 - - - -
0.0568 4200 0.7474 - - - -
0.0582 4300 0.6556 - - - -
0.0596 4400 0.8303 - - - -
0.0609 4500 0.6796 - - - -
0.0623 4600 0.7077 - - - -
0.0636 4700 0.6863 - - - -
0.0650 4800 0.6756 - - - -
0.0663 4900 0.6955 - - - -
0.0677 5000 0.7199 1.0589 0.6257 0.7161 0.4426
0.0690 5100 0.6744 - - - -
0.0704 5200 0.7609 - - - -
0.0717 5300 0.6707 - - - -
0.0731 5400 0.6796 - - - -
0.0744 5500 0.6842 - - - -
0.0758 5600 0.7358 - - - -
0.0771 5700 0.7578 - - - -
0.0785 5800 0.6822 - - - -
0.0799 5900 0.6847 - - - -
0.0812 6000 0.7556 1.0383 0.6199 0.7168 0.4488
0.0826 6100 0.7013 - - - -
0.0839 6200 0.6728 - - - -
0.0853 6300 0.6418 - - - -
0.0866 6400 0.6918 - - - -
0.0880 6500 0.7399 - - - -
0.0893 6600 0.7896 - - - -
0.0907 6700 0.6771 - - - -
0.0920 6800 0.6429 - - - -
0.0934 6900 0.6806 - - - -
0.0947 7000 0.6931 1.0354 0.6176 0.7195 0.4561
0.0961 7100 0.7115 - - - -
0.0974 7200 0.6108 - - - -
0.0988 7300 0.6889 - - - -
0.1002 7400 0.6451 - - - -
0.1015 7500 0.6501 - - - -
0.1029 7600 0.699 - - - -
0.1042 7700 0.6624 - - - -
0.1056 7800 0.7075 - - - -
0.1069 7900 0.6789 - - - -
0.1083 8000 0.6572 1.0391 0.6183 0.7211 0.4544
0.1096 8100 0.6754 - - - -
0.1110 8200 0.6404 - - - -
0.1123 8300 0.6816 - - - -
0.1137 8400 0.6485 - - - -
0.1150 8500 0.6794 - - - -
0.1164 8600 0.693 - - - -
0.1177 8700 0.5798 - - - -
0.1191 8800 0.7063 - - - -
0.1205 8900 0.6192 - - - -
0.1218 9000 0.6889 1.0438 0.6175 0.7243 0.4580
0.1232 9100 0.6881 - - - -
0.1245 9200 0.6369 - - - -
0.1259 9300 0.6451 - - - -
0.1272 9400 0.644 - - - -
0.1286 9500 0.7059 - - - -
0.1299 9600 0.5983 - - - -
0.1313 9700 0.5935 - - - -
0.1326 9800 0.634 - - - -
0.1340 9900 0.6716 - - - -
0.1353 10000 0.6591 1.0213 0.6132 0.7231 0.4640
0.1367 10100 0.6886 - - - -
0.1380 10200 0.6133 - - - -
0.1394 10300 0.5871 - - - -
0.1408 10400 0.5949 - - - -
0.1421 10500 0.6356 - - - -
0.1435 10600 0.6379 - - - -
0.1448 10700 0.6288 - - - -
0.1462 10800 0.6732 - - - -
0.1475 10900 0.6515 - - - -
0.1489 11000 0.7013 1.0164 0.6123 0.7257 0.4629
0.1502 11100 0.5848 - - - -
0.1516 11200 0.5988 - - - -
0.1529 11300 0.7331 - - - -
0.1543 11400 0.6089 - - - -
0.1556 11500 0.6553 - - - -
0.1570 11600 0.654 - - - -
0.1583 11700 0.6509 - - - -
0.1597 11800 0.6187 - - - -
0.1611 11900 0.6448 - - - -
0.1624 12000 0.6775 1.0087 0.6137 0.7257 0.4687
0.1638 12100 0.5793 - - - -
0.1651 12200 0.6827 - - - -
0.1665 12300 0.6002 - - - -
0.1678 12400 0.583 - - - -
0.1692 12500 0.6342 - - - -
0.1705 12600 0.6378 - - - -
0.1719 12700 0.6008 - - - -
0.1732 12800 0.6778 - - - -
0.1746 12900 0.6637 - - - -
0.1759 13000 0.6419 1.0117 0.6126 0.7234 0.4705
0.1773 13100 0.663 - - - -
0.1787 13200 0.5404 - - - -
0.1800 13300 0.6427 - - - -
0.1814 13400 0.6907 - - - -
0.1827 13500 0.63 - - - -
0.1841 13600 0.6501 - - - -
0.1854 13700 0.6124 - - - -
0.1868 13800 0.6381 - - - -
0.1881 13900 0.6324 - - - -
0.1895 14000 0.6542 1.0119 0.6126 0.7253 0.4641
0.1908 14100 0.6292 - - - -
0.1922 14200 0.6214 - - - -
0.1935 14300 0.643 - - - -
0.1949 14400 0.6094 - - - -
0.1962 14500 0.5929 - - - -
0.1976 14600 0.7236 - - - -
0.1990 14700 0.5857 - - - -
0.2003 14800 0.7177 - - - -
0.2017 14900 0.6651 - - - -
0.2030 15000 0.6197 1.0012 0.6098 0.727 0.4724
0.2044 15100 0.6128 - - - -
0.2057 15200 0.6281 - - - -
0.2071 15300 0.7106 - - - -
0.2084 15400 0.6095 - - - -
0.2098 15500 0.5855 - - - -
0.2111 15600 0.6124 - - - -
0.2125 15700 0.6233 - - - -
0.2138 15800 0.6511 - - - -
0.2152 15900 0.5701 - - - -
0.2165 16000 0.6011 0.9990 0.6083 0.7261 0.4756
0.2179 16100 0.5907 - - - -
0.2193 16200 0.599 - - - -
0.2206 16300 0.5879 - - - -
0.2220 16400 0.5505 - - - -
0.2233 16500 0.721 - - - -
0.2247 16600 0.6972 - - - -
0.2260 16700 0.6147 - - - -
0.2274 16800 0.6147 - - - -
0.2287 16900 0.6217 - - - -
0.2301 17000 0.6048 1.0026 0.6097 0.7284 0.4700
0.2314 17100 0.6233 - - - -
0.2328 17200 0.5569 - - - -
0.2341 17300 0.6158 - - - -
0.2355 17400 0.6483 - - - -
0.2368 17500 0.5811 - - - -
0.2382 17600 0.5988 - - - -
0.2396 17700 0.5472 - - - -
0.2409 17800 0.515 - - - -
0.2423 17900 0.6188 - - - -
0.2436 18000 0.6179 1.0068 0.6109 0.727 0.4749
0.2450 18100 0.6492 - - - -
0.2463 18200 0.6303 - - - -
0.2477 18300 0.6875 - - - -
0.2490 18400 0.6421 - - - -
0.2504 18500 0.5463 - - - -
0.2517 18600 0.6061 - - - -
0.2531 18700 0.6271 - - - -
0.2544 18800 0.5899 - - - -
0.2558 18900 0.583 - - - -
0.2571 19000 0.5725 1.0107 0.6102 0.7282 0.4717
0.2585 19100 0.578 - - - -
0.2599 19200 0.649 - - - -
0.2612 19300 0.5673 - - - -
0.2626 19400 0.6736 - - - -
0.2639 19500 0.6257 - - - -
0.2653 19600 0.6759 - - - -
0.2666 19700 0.5767 - - - -
0.2680 19800 0.6644 - - - -
0.2693 19900 0.6232 - - - -
0.2707 20000 0.5403 1.0150 0.6096 0.7279 0.4799
0.2720 20100 0.6195 - - - -
0.2734 20200 0.6111 - - - -
0.2747 20300 0.6524 - - - -
0.2761 20400 0.5863 - - - -
0.2774 20500 0.5788 - - - -
0.2788 20600 0.5401 - - - -
0.2802 20700 0.6166 - - - -
0.2815 20800 0.5687 - - - -
0.2829 20900 0.6352 - - - -
0.2842 21000 0.6574 1.0086 0.6104 0.7291 0.4772
0.2856 21100 0.633 - - - -
0.2869 21200 0.6008 - - - -
0.2883 21300 0.5929 - - - -
0.2896 21400 0.6791 - - - -
0.2910 21500 0.6044 - - - -
0.2923 21600 0.5487 - - - -
0.2937 21700 0.5302 - - - -
0.2950 21800 0.5842 - - - -
0.2964 21900 0.5931 - - - -
0.2978 22000 0.5376 1.0130 0.6114 0.7292 0.4803
0.2991 22100 0.511 - - - -
0.3005 22200 0.5989 - - - -
0.3018 22300 0.6184 - - - -
0.3032 22400 0.5367 - - - -
0.3045 22500 0.6855 - - - -
0.3059 22600 0.6058 - - - -
0.3072 22700 0.582 - - - -
0.3086 22800 0.5601 - - - -
0.3099 22900 0.6476 - - - -
0.3113 23000 0.5905 1.0174 0.6103 0.7294 0.4818
0.3126 23100 0.6215 - - - -
0.3140 23200 0.5134 - - - -
0.3153 23300 0.5508 - - - -
0.3167 23400 0.5855 - - - -
0.3181 23500 0.604 - - - -
0.3194 23600 0.6711 - - - -
0.3208 23700 0.6602 - - - -
0.3221 23800 0.5083 - - - -
0.3235 23900 0.5928 - - - -
0.3248 24000 0.5756 1.0079 0.6117 0.7304 0.4850
0.3262 24100 0.5659 - - - -
0.3275 24200 0.5664 - - - -
0.3289 24300 0.5622 - - - -
0.3302 24400 0.6685 - - - -
0.3316 24500 0.5807 - - - -
0.3329 24600 0.5583 - - - -
0.3343 24700 0.5634 - - - -
0.3356 24800 0.6452 - - - -
0.3370 24900 0.5716 - - - -
0.3384 25000 0.5411 1.0043 0.6116 0.7289 0.4851
0.3397 25100 0.583 - - - -
0.3411 25200 0.5801 - - - -
0.3424 25300 0.52 - - - -
0.3438 25400 0.5882 - - - -
0.3451 25500 0.5788 - - - -
0.3465 25600 0.6031 - - - -
0.3478 25700 0.5806 - - - -
0.3492 25800 0.541 - - - -
0.3505 25900 0.6236 - - - -
0.3519 26000 0.5642 1.0042 0.6124 0.7283 0.4798
0.3532 26100 0.5681 - - - -
0.3546 26200 0.5849 - - - -
0.3559 26300 0.5879 - - - -
0.3573 26400 0.5634 - - - -
0.3587 26500 0.5681 - - - -
0.3600 26600 0.6432 - - - -
0.3614 26700 0.5447 - - - -
0.3627 26800 0.5574 - - - -
0.3641 26900 0.5698 - - - -
0.3654 27000 0.6691 1.0087 0.6126 0.7286 0.4829
0.3668 27100 0.6235 - - - -
0.3681 27200 0.5478 - - - -
0.3695 27300 0.586 - - - -
0.3708 27400 0.5454 - - - -
0.3722 27500 0.5608 - - - -
0.3735 27600 0.6274 - - - -
0.3749 27700 0.5939 - - - -
0.3762 27800 0.5673 - - - -
0.3776 27900 0.5784 - - - -
0.3790 28000 0.6069 1.0183 0.6126 0.7295 0.4798
0.3803 28100 0.5733 - - - -
0.3817 28200 0.6075 - - - -
0.3830 28300 0.5933 - - - -
0.3844 28400 0.5907 - - - -
0.3857 28500 0.5869 - - - -
0.3871 28600 0.5781 - - - -
0.3884 28700 0.6056 - - - -
0.3898 28800 0.5676 - - - -
0.3911 28900 0.5997 - - - -
0.3925 29000 0.5936 1.0096 0.6135 0.7269 0.4866
0.3938 29100 0.5261 - - - -
0.3952 29200 0.53 - - - -
0.3966 29300 0.5153 - - - -
0.3979 29400 0.5161 - - - -
0.3993 29500 0.5723 - - - -
0.4006 29600 0.6247 - - - -
0.4020 29700 0.5521 - - - -
0.4033 29800 0.5528 - - - -
0.4047 29900 0.5917 - - - -
0.4060 30000 0.5267 1.0133 0.6117 0.7258 0.4869
0.4074 30100 0.6074 - - - -
0.4087 30200 0.5774 - - - -
0.4101 30300 0.5645 - - - -
0.4114 30400 0.5908 - - - -
0.4128 30500 0.5364 - - - -
0.4141 30600 0.5945 - - - -
0.4155 30700 0.5497 - - - -
0.4169 30800 0.5291 - - - -
0.4182 30900 0.5701 - - - -
0.4196 31000 0.5788 1.0041 0.6143 0.727 0.4870
0.4209 31100 0.6269 - - - -
0.4223 31200 0.4914 - - - -
0.4236 31300 0.5144 - - - -
0.4250 31400 0.6026 - - - -
0.4263 31500 0.5646 - - - -
0.4277 31600 0.6424 - - - -
0.4290 31700 0.5755 - - - -
0.4304 31800 0.5646 - - - -
0.4317 31900 0.573 - - - -
0.4331 32000 0.5648 1.0000 0.6133 0.7258 0.4867
0.4344 32100 0.5113 - - - -
0.4358 32200 0.5836 - - - -
0.4372 32300 0.6013 - - - -
0.4385 32400 0.5698 - - - -
0.4399 32500 0.5731 - - - -
0.4412 32600 0.489 - - - -
0.4426 32700 0.5728 - - - -
0.4439 32800 0.4829 - - - -
0.4453 32900 0.5783 - - - -
0.4466 33000 0.6191 1.0009 0.6162 0.7239 0.4863
0.4480 33100 0.5383 - - - -
0.4493 33200 0.5611 - - - -
0.4507 33300 0.5346 - - - -
0.4520 33400 0.5451 - - - -
0.4534 33500 0.5719 - - - -
0.4547 33600 0.5272 - - - -
0.4561 33700 0.5747 - - - -
0.4575 33800 0.509 - - - -
0.4588 33900 0.5746 - - - -
0.4602 34000 0.5873 0.9978 0.6142 0.7257 0.4914
0.4615 34100 0.5948 - - - -
0.4629 34200 0.5344 - - - -
0.4642 34300 0.5398 - - - -
0.4656 34400 0.6095 - - - -
0.4669 34500 0.5878 - - - -
0.4683 34600 0.5372 - - - -
0.4696 34700 0.5113 - - - -
0.4710 34800 0.5675 - - - -
0.4723 34900 0.5268 - - - -
0.4737 35000 0.4527 1.0195 0.6185 0.7254 0.4918
0.4750 35100 0.5625 - - - -
0.4764 35200 0.5786 - - - -
0.4778 35300 0.5327 - - - -
0.4791 35400 0.568 - - - -
0.4805 35500 0.5652 - - - -
0.4818 35600 0.61 - - - -
0.4832 35700 0.604 - - - -
0.4845 35800 0.6238 - - - -
0.4859 35900 0.5492 - - - -
0.4872 36000 0.5459 1.0140 0.6201 0.7237 0.4877
0.4886 36100 0.5833 - - - -
0.4899 36200 0.5663 - - - -
0.4913 36300 0.5248 - - - -
0.4926 36400 0.5352 - - - -
0.4940 36500 0.5271 - - - -
0.4953 36600 0.5142 - - - -
0.4967 36700 0.5173 - - - -
0.4981 36800 0.6029 - - - -
0.4994 36900 0.5732 - - - -
0.5008 37000 0.5887 1.0166 0.6182 0.7276 0.4938
0.5021 37100 0.529 - - - -
0.5035 37200 0.6251 - - - -
0.5048 37300 0.4641 - - - -
0.5062 37400 0.5818 - - - -
0.5075 37500 0.6206 - - - -
0.5089 37600 0.4771 - - - -
0.5102 37700 0.5578 - - - -
0.5116 37800 0.5857 - - - -
0.5129 37900 0.5658 - - - -
0.5143 38000 0.5514 1.0124 0.6188 0.727 0.4904
0.5157 38100 0.5092 - - - -
0.5170 38200 0.5495 - - - -
0.5184 38300 0.5263 - - - -
0.5197 38400 0.5399 - - - -
0.5211 38500 0.5643 - - - -
0.5224 38600 0.5608 - - - -
0.5238 38700 0.4812 - - - -
0.5251 38800 0.4792 - - - -
0.5265 38900 0.5185 - - - -
0.5278 39000 0.4966 1.0211 0.6196 0.7251 0.4902
0.5292 39100 0.6323 - - - -
0.5305 39200 0.4468 - - - -
0.5319 39300 0.6048 - - - -
0.5332 39400 0.4753 - - - -
0.5346 39500 0.5749 - - - -
0.5360 39600 0.5466 - - - -
0.5373 39700 0.5235 - - - -
0.5387 39800 0.5608 - - - -
0.5400 39900 0.5072 - - - -
0.5414 40000 0.5574 1.0107 0.6220 0.7272 0.4924
0.5427 40100 0.5694 - - - -
0.5441 40200 0.5462 - - - -
0.5454 40300 0.6253 - - - -
0.5468 40400 0.5736 - - - -
0.5481 40500 0.5225 - - - -
0.5495 40600 0.5313 - - - -
0.5508 40700 0.4789 - - - -
0.5522 40800 0.5424 - - - -
0.5535 40900 0.5282 - - - -
0.5549 41000 0.4923 1.0111 0.6215 0.7258 0.4906
0.5563 41100 0.5614 - - - -
0.5576 41200 0.552 - - - -
0.5590 41300 0.5455 - - - -
0.5603 41400 0.5593 - - - -
0.5617 41500 0.527 - - - -
0.5630 41600 0.5886 - - - -
0.5644 41700 0.5066 - - - -
0.5657 41800 0.6026 - - - -
0.5671 41900 0.5673 - - - -
0.5684 42000 0.5392 1.0095 0.6220 0.7261 0.4906
0.5698 42100 0.5483 - - - -
0.5711 42200 0.5596 - - - -
0.5725 42300 0.5462 - - - -
0.5738 42400 0.495 - - - -
0.5752 42500 0.4769 - - - -
0.5766 42600 0.6079 - - - -
0.5779 42700 0.5764 - - - -
0.5793 42800 0.5553 - - - -
0.5806 42900 0.4955 - - - -
0.5820 43000 0.568 1.0159 0.6221 0.7276 0.4926
0.5833 43100 0.4474 - - - -
0.5847 43200 0.5976 - - - -
0.5860 43300 0.5831 - - - -
0.5874 43400 0.4641 - - - -
0.5887 43500 0.5126 - - - -
0.5901 43600 0.5044 - - - -
0.5914 43700 0.5308 - - - -
0.5928 43800 0.5399 - - - -
0.5941 43900 0.5638 - - - -
0.5955 44000 0.5718 1.0135 0.6226 0.7268 0.4925
0.5969 44100 0.4601 - - - -
0.5982 44200 0.5542 - - - -
0.5996 44300 0.5645 - - - -
0.6009 44400 0.5284 - - - -
0.6023 44500 0.5632 - - - -
0.6036 44600 0.4867 - - - -
0.6050 44700 0.5773 - - - -
0.6063 44800 0.4619 - - - -
0.6077 44900 0.5044 - - - -
0.6090 45000 0.5379 1.0204 0.6268 0.7246 0.4889
0.6104 45100 0.4914 - - - -
0.6117 45200 0.5678 - - - -
0.6131 45300 0.5516 - - - -
0.6144 45400 0.5519 - - - -
0.6158 45500 0.4939 - - - -
0.6172 45600 0.4991 - - - -
0.6185 45700 0.4988 - - - -
0.6199 45800 0.5275 - - - -
0.6212 45900 0.51 - - - -
0.6226 46000 0.5478 1.0193 0.6250 0.726 0.4880
0.6239 46100 0.532 - - - -
0.6253 46200 0.5847 - - - -
0.6266 46300 0.5285 - - - -
0.6280 46400 0.4651 - - - -
0.6293 46500 0.5035 - - - -
0.6307 46600 0.6693 - - - -
0.6320 46700 0.4864 - - - -
0.6334 46800 0.5401 - - - -
0.6348 46900 0.5968 - - - -
0.6361 47000 0.5339 1.0217 0.6255 0.7261 0.4912
0.6375 47100 0.5183 - - - -
0.6388 47200 0.4989 - - - -
0.6402 47300 0.5263 - - - -
0.6415 47400 0.4698 - - - -
0.6429 47500 0.5878 - - - -
0.6442 47600 0.5186 - - - -
0.6456 47700 0.4365 - - - -
0.6469 47800 0.5596 - - - -
0.6483 47900 0.4989 - - - -
0.6496 48000 0.4629 1.0253 0.6279 0.7267 0.4903
0.6510 48100 0.4798 - - - -
0.6523 48200 0.541 - - - -
0.6537 48300 0.4916 - - - -
0.6551 48400 0.5228 - - - -
0.6564 48500 0.5612 - - - -
0.6578 48600 0.4756 - - - -
0.6591 48700 0.4542 - - - -
0.6605 48800 0.5226 - - - -
0.6618 48900 0.4651 - - - -
0.6632 49000 0.5673 1.0208 0.6264 0.7259 0.4934
0.6645 49100 0.6201 - - - -
0.6659 49200 0.5079 - - - -
0.6672 49300 0.5184 - - - -
0.6686 49400 0.4925 - - - -
0.6699 49500 0.5116 - - - -
0.6713 49600 0.5157 - - - -
0.6726 49700 0.5521 - - - -
0.6740 49800 0.5871 - - - -
0.6754 49900 0.5028 - - - -
0.6767 50000 0.4776 1.0173 0.6305 0.724 0.4916

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.0
  • Transformers: 4.38.2
  • PyTorch: 2.1.2+cu121
  • Accelerate: 0.27.2
  • Datasets: 2.19.1
  • Tokenizers: 0.15.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, 
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

AnglELoss

@misc{li2023angleoptimized,
    title={AnglE-optimized Text Embeddings}, 
    author={Xianming Li and Jing Li},
    year={2023},
    eprint={2309.12871},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}