---
language:
  - en
license: apache-2.0
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dataset_size:1M<n<10M
  - loss:MultipleNegativesRankingLoss
base_model: microsoft/mpnet-base
datasets:
  - sentence-transformers/gooaq
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
  - dot_accuracy@1
  - dot_accuracy@3
  - dot_accuracy@5
  - dot_accuracy@10
  - dot_precision@1
  - dot_precision@3
  - dot_precision@5
  - dot_precision@10
  - dot_recall@1
  - dot_recall@3
  - dot_recall@5
  - dot_recall@10
  - dot_ndcg@10
  - dot_mrr@10
  - dot_map@100
widget:
  - source_sentence: 11 is what of 8?
    sentences:
      - >-
        *RARE* CANDY AXE AND RED NOSED RAIDER IS BACK - FORTNITE ITEM SHOP 8TH
        DECEMBER 2019.
      - 'Convert fraction (ratio) 8 / 11 Answer: 72.727272727273%'
      - >-
        Old-age pensions are not included in taxable income under the personal
        income tax.
  - source_sentence: is 50 shades of grey on prime?
    sentences:
      - 'Amazon.com: Watch Fifty Shades of Grey. Prime Video.'
      - >-
        How much is 22 out of 100 written as a percentage? Convert fraction
        (ratio) 22 / 100 Answer: 22%
      - >-
        Petco ferrets are neutered and as social animals, they enjoy each
        other's company.
  - source_sentence: 20 of what is 18?
    sentences:
      - >-
        20 percent (calculated percentage %) of what number equals 18? Answer:
        90.
      - There are 3.35 x 1019 H2O molecules in a 1 mg snowflake.
      - >-
        There are 104 total Power Moons and 100 Purple Coins in the Mushroom
        Kingdom.
  - source_sentence: 63 up itv when is it on?
    sentences:
      - >-
        Mark Twain Quotes If you tell the truth, you don't have to remember
        anything.
      - >-
        63 Up is on ITV for three consecutive nights, Tuesday 4 – Thursday 6
        June, at 9pm.
      - In a language, the smallest units of meaning are. Morphemes.
  - source_sentence: what is ikit in tagalog?
    sentences:
      - >-
        Definition: aunt. the sister of one's father or mother; the wife of
        one's uncle (n.)
      - >-
        How much is 12 out of 29 written as a percentage? Convert fraction
        (ratio) 12 / 29 Answer: 41.379310344828%
      - >-
        Iberia offers Wi-Fi on all long-haul aircraft so that you can stay
        connected using your own devices.
pipeline_tag: sentence-similarity
co2_eq_emissions:
  emissions: 636.2415070661234
  energy_consumed: 1.636836206312608
  source: codecarbon
  training_type: fine-tuning
  on_cloud: false
  cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
  ram_total_size: 31.777088165283203
  hours_used: 4.514
  hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
  - name: MPNet base trained on GooAQ Question-Answer tuples
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: gooaq dev
          type: gooaq-dev
        metrics:
          - type: cosine_accuracy@1
            value: 0.7198
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.884
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9305
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9709
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7198
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.29466666666666663
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1861
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09709000000000002
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7198
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.884
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9305
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9709
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8490972112228806
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8095713888888812
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8111457785591406
            name: Cosine Map@100
          - type: dot_accuracy@1
            value: 0.7073
            name: Dot Accuracy@1
          - type: dot_accuracy@3
            value: 0.877
            name: Dot Accuracy@3
          - type: dot_accuracy@5
            value: 0.9244
            name: Dot Accuracy@5
          - type: dot_accuracy@10
            value: 0.9669
            name: Dot Accuracy@10
          - type: dot_precision@1
            value: 0.7073
            name: Dot Precision@1
          - type: dot_precision@3
            value: 0.2923333333333333
            name: Dot Precision@3
          - type: dot_precision@5
            value: 0.18488000000000002
            name: Dot Precision@5
          - type: dot_precision@10
            value: 0.09669000000000003
            name: Dot Precision@10
          - type: dot_recall@1
            value: 0.7073
            name: Dot Recall@1
          - type: dot_recall@3
            value: 0.877
            name: Dot Recall@3
          - type: dot_recall@5
            value: 0.9244
            name: Dot Recall@5
          - type: dot_recall@10
            value: 0.9669
            name: Dot Recall@10
          - type: dot_ndcg@10
            value: 0.8412144933973646
            name: Dot Ndcg@10
          - type: dot_mrr@10
            value: 0.8004067857142795
            name: Dot Mrr@10
          - type: dot_map@100
            value: 0.8022667466578848
            name: Dot Map@100
---

# MPNet base trained on GooAQ Question-Answer tuples

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

This model was trained using the accompanying train_script.py training script.

## Model Details

### Model Description

- Model Type: Sentence Transformer
- Base model: [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base)
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq)
- Language: en
- License: apache-2.0

### Model Sources

- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
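
The Pooling module above performs mean pooling: MPNet's token embeddings are averaged over the non-padding positions. As a rough illustration (not the recommended usage path), the same 768-dimensional sentence embedding can be reproduced with `transformers` directly. The sentence is borrowed from the widget examples; this is a minimal sketch, assuming the repository exposes the underlying MPNetModel weights, as sentence-transformers repositories do:

```python
# Illustrative sketch of the mean pooling performed by module (1) above.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tomaarsen/mpnet-base-gooaq")
mpnet = AutoModel.from_pretrained("tomaarsen/mpnet-base-gooaq")

inputs = tokenizer(["what is ikit in tagalog?"], return_tensors="pt")
with torch.no_grad():
    token_embeddings = mpnet(**inputs).last_hidden_state  # [1, seq_len, 768]

# Mean pooling: average the token embeddings over non-padding positions.
mask = inputs["attention_mask"].unsqueeze(-1).float()     # [1, seq_len, 1]
sentence_embedding = (token_embeddings * mask).sum(1) / mask.sum(1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```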

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/mpnet-base-gooaq")
# Run inference
sentences = [
    '11 is what of 8?',
    'Convert fraction (ratio) 8 / 11 Answer: 72.727272727273%',
    'Old-age pensions are not included in taxable income under the personal income tax.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
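
Because questions and answers are embedded into the same vector space, the model can also be used directly for semantic search over a candidate pool. A small sketch reusing sentences from the widget examples above (the candidate list is illustrative):

```python
# Hypothetical semantic-search example: rank candidate answers for a question.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/mpnet-base-gooaq")

question = "is 50 shades of grey on prime?"
candidates = [
    "Amazon.com: Watch Fifty Shades of Grey. Prime Video.",
    "63 Up is on ITV for three consecutive nights, Tuesday 4 – Thursday 6 June, at 9pm.",
    "Petco ferrets are neutered and as social animals, they enjoy each other's company.",
]

# Embed the query and the candidates, then score with cosine similarity.
query_embedding = model.encode([question])
candidate_embeddings = model.encode(candidates)
scores = model.similarity(query_embedding, candidate_embeddings)  # shape [1, 3]

best = scores.argmax().item()
print(candidates[best])  # the Prime Video answer should rank first
```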

## Evaluation

### Metrics

#### Information Retrieval

| Metric              | Value  |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.7198 |
| cosine_accuracy@3   | 0.884  |
| cosine_accuracy@5   | 0.9305 |
| cosine_accuracy@10  | 0.9709 |
| cosine_precision@1  | 0.7198 |
| cosine_precision@3  | 0.2947 |
| cosine_precision@5  | 0.1861 |
| cosine_precision@10 | 0.0971 |
| cosine_recall@1     | 0.7198 |
| cosine_recall@3     | 0.884  |
| cosine_recall@5     | 0.9305 |
| cosine_recall@10    | 0.9709 |
| cosine_ndcg@10      | 0.8491 |
| cosine_mrr@10       | 0.8096 |
| cosine_map@100      | 0.8111 |
| dot_accuracy@1      | 0.7073 |
| dot_accuracy@3      | 0.877  |
| dot_accuracy@5      | 0.9244 |
| dot_accuracy@10     | 0.9669 |
| dot_precision@1     | 0.7073 |
| dot_precision@3     | 0.2923 |
| dot_precision@5     | 0.1849 |
| dot_precision@10    | 0.0967 |
| dot_recall@1        | 0.7073 |
| dot_recall@3        | 0.877  |
| dot_recall@5        | 0.9244 |
| dot_recall@10       | 0.9669 |
| dot_ndcg@10         | 0.8412 |
| dot_mrr@10          | 0.8004 |
| dot_map@100         | 0.8023 |
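
These numbers come from an information-retrieval evaluation on the gooaq-dev split: each question must retrieve its gold answer from the pool of dev answers. A minimal sketch of how such an evaluation can be run with sentence-transformers' InformationRetrievalEvaluator; the toy queries and corpus below are placeholders, not the actual dev set:

```python
# Minimal sketch of an information-retrieval evaluation; the real gooaq-dev
# evaluation uses 10,000 held-out question-answer pairs (see train_script.py).
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("tomaarsen/mpnet-base-gooaq")

# Map query/corpus ids to text; relevant_docs maps each query id to the
# set of corpus ids that count as correct hits.
queries = {"q1": "is 50 shades of grey on prime?"}
corpus = {
    "d1": "Amazon.com: Watch Fifty Shades of Grey. Prime Video.",
    "d2": "In a language, the smallest units of meaning are. Morphemes.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="gooaq-dev")
results = evaluator(model)
print(results)  # accuracy/precision/recall@k, NDCG@10, MRR@10, MAP@100
```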

## Training Details

### Training Dataset

#### sentence-transformers/gooaq

- Dataset: [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at b089f72
- Size: 3,002,496 training samples
- Columns: `question` and `answer`
- Approximate statistics based on the first 1000 samples:
  |         | question                                          | answer                                              |
  |:--------|:--------------------------------------------------|:-----------------------------------------------------|
  | type    | string                                            | string                                              |
  | details | min: 8 tokens, mean: 11.89 tokens, max: 22 tokens | min: 15 tokens, mean: 60.37 tokens, max: 147 tokens |
- Samples:
  | question | answer |
  |:---------|:-------|
  | biotechnology is best defined as? | Biotechnology is best defined as_______________? The science that involves using living organisms to produce needed materials. Which of the following tools of biotechnology, to do investigation, is used when trying crime? |
  | how to open xye file? | Firstly, use File then Open and make sure that you can see All Files (*. *) and not just Excel files (the default option!) in the folder containing the *. xye file: Select the file you wish to open and Excel will bring up a wizard menu for importing plain text data into Excel (as shown below). |
  | how much does california spend? | Estimated 2016 expenditures The total estimated government spending in California in fiscal year 2016 was $265.9 billion. Per-capita figures are calculated by taking the state's total spending and dividing by the number of state residents according to United States Census Bureau estimates. |
- Loss: MultipleNegativesRankingLoss (sketched just below) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
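MultipleNegativesRankingLoss treats every other answer in a batch as a negative for a given question: with a batch of 64 pairs, each question must pick out its own answer among 64 candidates, which is why larger batches tend to help. A hedged PyTorch sketch of the computation (the idea, not the library's exact implementation):

```python
# Sketch of MultipleNegativesRankingLoss with cosine similarity and scale=20.0:
# in-batch negatives, cross-entropy over the scaled similarity matrix.
import torch
import torch.nn.functional as F

def mnrl(question_emb: torch.Tensor, answer_emb: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # Entry [i, j] is the cosine similarity of question i and answer j.
    sims = F.normalize(question_emb, dim=-1) @ F.normalize(answer_emb, dim=-1).T
    # The matching answer for question i sits on the diagonal (index i).
    labels = torch.arange(len(question_emb))
    return F.cross_entropy(sims * scale, labels)

loss = mnrl(torch.randn(64, 768), torch.randn(64, 768))  # batch_size=64, dim=768
```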

### Evaluation Dataset

#### sentence-transformers/gooaq

- Dataset: [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at b089f72
- Size: 10,000 evaluation samples
- Columns: `question` and `answer`
- Approximate statistics based on the first 1000 samples:
  |         | question                                          | answer                                              |
  |:--------|:--------------------------------------------------|:-----------------------------------------------------|
  | type    | string                                            | string                                              |
  | details | min: 8 tokens, mean: 11.86 tokens, max: 25 tokens | min: 14 tokens, mean: 60.82 tokens, max: 166 tokens |
- Samples:
  | question | answer |
  |:---------|:-------|
  | how to open nx file? | ['Click File > Open. The File Open dialog box opens.', 'Select NX File (*. prt) in the Type box. ... ', 'Select an NX . ... ', 'Select Import in the File Open dialog box. ... ', 'If you do not want to retain the import profile in use, select an import profile from the Profile list. ... ', 'Click OK in the Import New Model dialog box.'] |
  | how to recover deleted photos from blackberry priv? | ['Run Android Data Recovery. ... ', 'Enable USB Debugging Mode. ... ', 'Scan Your BlackBerry PRIV to Find Deleted Photos. ... ', 'Recover Deleted Photos from BlackBerry PRIV.'] |
  | which subatomic particles are found within the nucleus of an atom? | In the middle of every atom is the nucleus. The nucleus contains two types of subatomic particles, protons and neutrons. The protons have a positive electrical charge and the neutrons have no electrical charge. A third type of subatomic particle, electrons, move around the nucleus. |
- Loss: MultipleNegativesRankingLoss with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
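The 10,000 evaluation pairs are held out from the same GooAQ data as the training samples. A sketch of how both splits can be reconstructed; the seed and split method are assumptions, so consult train_script.py for the exact procedure:

```python
# Load the dataset at the pinned revision and hold out an evaluation split.
from datasets import load_dataset

dataset = load_dataset("sentence-transformers/gooaq", split="train", revision="b089f72")
dataset = dataset.train_test_split(test_size=10_000, seed=12)  # seed is an assumption
train_dataset, eval_dataset = dataset["train"], dataset["test"]
print(train_dataset.num_rows, eval_dataset.num_rows)  # 3002496 10000
```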

### Training Hyperparameters

#### Non-Default Hyperparameters

- eval_strategy: steps
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 64
- learning_rate: 2e-05
- num_train_epochs: 1
- warmup_ratio: 0.1
- bf16: True
- batch_sampler: no_duplicates
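
These non-default values map directly onto the sentence-transformers v3 training API. A condensed, hypothetical sketch of the training setup (the output path and split seed are assumptions; train_script.py is the authoritative version):

```python
# Condensed sketch of the training run described by this card.
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Plain MPNet checkpoint; sentence-transformers adds mean pooling on top.
model = SentenceTransformer("microsoft/mpnet-base")
loss = MultipleNegativesRankingLoss(model)

dataset = load_dataset("sentence-transformers/gooaq", split="train", revision="b089f72")
dataset = dataset.train_test_split(test_size=10_000, seed=12)  # seed is an assumption

args = SentenceTransformerTrainingArguments(
    output_dir="models/mpnet-base-gooaq",       # illustrative path
    num_train_epochs=1,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate in-batch negatives
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    loss=loss,
)
trainer.train()
```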

#### All Hyperparameters

<details><summary>Click to expand</summary>

- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 64
- per_device_eval_batch_size: 64
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional

</details>

### Training Logs

<details><summary>Click to expand</summary>

| Epoch | Step | Training Loss | Validation Loss | gooaq-dev_cosine_map@100 |
|:------|:-----|:--------------|:----------------|:-------------------------|
| 0 | 0 | - | - | 0.1379 |
| 0.0000 | 1 | 3.6452 | - | - |
| 0.0053 | 250 | 2.4418 | - | - |
| 0.0107 | 500 | 0.373 | - | - |
| 0.0160 | 750 | 0.183 | - | - |
| 0.0213 | 1000 | 0.1286 | 0.0805 | 0.6796 |
| 0.0266 | 1250 | 0.1099 | - | - |
| 0.0320 | 1500 | 0.091 | - | - |
| 0.0373 | 1750 | 0.0768 | - | - |
| 0.0426 | 2000 | 0.0665 | 0.0526 | 0.7162 |
| 0.0480 | 2250 | 0.0659 | - | - |
| 0.0533 | 2500 | 0.0602 | - | - |
| 0.0586 | 2750 | 0.0548 | - | - |
| 0.0639 | 3000 | 0.0543 | 0.0426 | 0.7328 |
| 0.0693 | 3250 | 0.0523 | - | - |
| 0.0746 | 3500 | 0.0494 | - | - |
| 0.0799 | 3750 | 0.0468 | - | - |
| 0.0853 | 4000 | 0.0494 | 0.0362 | 0.7450 |
| 0.0906 | 4250 | 0.048 | - | - |
| 0.0959 | 4500 | 0.0442 | - | - |
| 0.1012 | 4750 | 0.0442 | - | - |
| 0.1066 | 5000 | 0.0408 | 0.0332 | 0.7519 |
| 0.1119 | 5250 | 0.0396 | - | - |
| 0.1172 | 5500 | 0.0379 | - | - |
| 0.1226 | 5750 | 0.0392 | - | - |
| 0.1279 | 6000 | 0.0395 | 0.0300 | 0.7505 |
| 0.1332 | 6250 | 0.0349 | - | - |
| 0.1386 | 6500 | 0.0383 | - | - |
| 0.1439 | 6750 | 0.0335 | - | - |
| 0.1492 | 7000 | 0.0323 | 0.0253 | 0.7624 |
| 0.1545 | 7250 | 0.0342 | - | - |
| 0.1599 | 7500 | 0.0292 | - | - |
| 0.1652 | 7750 | 0.0309 | - | - |
| 0.1705 | 8000 | 0.0335 | 0.0249 | 0.7631 |
| 0.1759 | 8250 | 0.0304 | - | - |
| 0.1812 | 8500 | 0.0318 | - | - |
| 0.1865 | 8750 | 0.0271 | - | - |
| 0.1918 | 9000 | 0.029 | 0.0230 | 0.7615 |
| 0.1972 | 9250 | 0.0309 | - | - |
| 0.2025 | 9500 | 0.0305 | - | - |
| 0.2078 | 9750 | 0.0237 | - | - |
| 0.2132 | 10000 | 0.0274 | 0.0220 | 0.7667 |
| 0.2185 | 10250 | 0.0248 | - | - |
| 0.2238 | 10500 | 0.0249 | - | - |
| 0.2291 | 10750 | 0.0272 | - | - |
| 0.2345 | 11000 | 0.0289 | 0.0230 | 0.7664 |
| 0.2398 | 11250 | 0.027 | - | - |
| 0.2451 | 11500 | 0.0259 | - | - |
| 0.2505 | 11750 | 0.0237 | - | - |
| 0.2558 | 12000 | 0.0245 | 0.0220 | 0.7694 |
| 0.2611 | 12250 | 0.0251 | - | - |
| 0.2664 | 12500 | 0.0243 | - | - |
| 0.2718 | 12750 | 0.0229 | - | - |
| 0.2771 | 13000 | 0.0273 | 0.0201 | 0.7725 |
| 0.2824 | 13250 | 0.0244 | - | - |
| 0.2878 | 13500 | 0.0248 | - | - |
| 0.2931 | 13750 | 0.0255 | - | - |
| 0.2984 | 14000 | 0.0244 | 0.0192 | 0.7729 |
| 0.3037 | 14250 | 0.0242 | - | - |
| 0.3091 | 14500 | 0.0235 | - | - |
| 0.3144 | 14750 | 0.0231 | - | - |
| 0.3197 | 15000 | 0.0228 | 0.0190 | 0.7823 |
| 0.3251 | 15250 | 0.0229 | - | - |
| 0.3304 | 15500 | 0.0224 | - | - |
| 0.3357 | 15750 | 0.0216 | - | - |
| 0.3410 | 16000 | 0.0218 | 0.0186 | 0.7787 |
| 0.3464 | 16250 | 0.022 | - | - |
| 0.3517 | 16500 | 0.0233 | - | - |
| 0.3570 | 16750 | 0.0216 | - | - |
| 0.3624 | 17000 | 0.0226 | 0.0169 | 0.7862 |
| 0.3677 | 17250 | 0.0215 | - | - |
| 0.3730 | 17500 | 0.0212 | - | - |
| 0.3784 | 17750 | 0.0178 | - | - |
| 0.3837 | 18000 | 0.0217 | 0.0161 | 0.7813 |
| 0.3890 | 18250 | 0.0217 | - | - |
| 0.3943 | 18500 | 0.0191 | - | - |
| 0.3997 | 18750 | 0.0216 | - | - |
| 0.4050 | 19000 | 0.022 | 0.0157 | 0.7868 |
| 0.4103 | 19250 | 0.0223 | - | - |
| 0.4157 | 19500 | 0.021 | - | - |
| 0.4210 | 19750 | 0.0176 | - | - |
| 0.4263 | 20000 | 0.021 | 0.0162 | 0.7873 |
| 0.4316 | 20250 | 0.0206 | - | - |
| 0.4370 | 20500 | 0.0196 | - | - |
| 0.4423 | 20750 | 0.0186 | - | - |
| 0.4476 | 21000 | 0.0197 | 0.0158 | 0.7907 |
| 0.4530 | 21250 | 0.0156 | - | - |
| 0.4583 | 21500 | 0.0178 | - | - |
| 0.4636 | 21750 | 0.0175 | - | - |
| 0.4689 | 22000 | 0.0187 | 0.0151 | 0.7937 |
| 0.4743 | 22250 | 0.0182 | - | - |
| 0.4796 | 22500 | 0.0185 | - | - |
| 0.4849 | 22750 | 0.0217 | - | - |
| 0.4903 | 23000 | 0.0179 | 0.0156 | 0.7937 |
| 0.4956 | 23250 | 0.0193 | - | - |
| 0.5009 | 23500 | 0.015 | - | - |
| 0.5062 | 23750 | 0.0181 | - | - |
| 0.5116 | 24000 | 0.0173 | 0.0150 | 0.7924 |
| 0.5169 | 24250 | 0.0177 | - | - |
| 0.5222 | 24500 | 0.0183 | - | - |
| 0.5276 | 24750 | 0.0171 | - | - |
| 0.5329 | 25000 | 0.0185 | 0.0140 | 0.7955 |
| 0.5382 | 25250 | 0.0178 | - | - |
| 0.5435 | 25500 | 0.015 | - | - |
| 0.5489 | 25750 | 0.017 | - | - |
| 0.5542 | 26000 | 0.0171 | 0.0139 | 0.7931 |
| 0.5595 | 26250 | 0.0164 | - | - |
| 0.5649 | 26500 | 0.0175 | - | - |
| 0.5702 | 26750 | 0.0175 | - | - |
| 0.5755 | 27000 | 0.0163 | 0.0133 | 0.7954 |
| 0.5809 | 27250 | 0.0179 | - | - |
| 0.5862 | 27500 | 0.016 | - | - |
| 0.5915 | 27750 | 0.0155 | - | - |
| 0.5968 | 28000 | 0.0162 | 0.0138 | 0.7979 |
| 0.6022 | 28250 | 0.0164 | - | - |
| 0.6075 | 28500 | 0.0148 | - | - |
| 0.6128 | 28750 | 0.0152 | - | - |
| 0.6182 | 29000 | 0.0166 | 0.0134 | 0.7987 |
| 0.6235 | 29250 | 0.0159 | - | - |
| 0.6288 | 29500 | 0.0168 | - | - |
| 0.6341 | 29750 | 0.0187 | - | - |
| 0.6395 | 30000 | 0.017 | 0.0137 | 0.7980 |
| 0.6448 | 30250 | 0.0168 | - | - |
| 0.6501 | 30500 | 0.0149 | - | - |
| 0.6555 | 30750 | 0.0159 | - | - |
| 0.6608 | 31000 | 0.0149 | 0.0131 | 0.8017 |
| 0.6661 | 31250 | 0.0149 | - | - |
| 0.6714 | 31500 | 0.0147 | - | - |
| 0.6768 | 31750 | 0.0157 | - | - |
| 0.6821 | 32000 | 0.0151 | 0.0125 | 0.8011 |
| 0.6874 | 32250 | 0.015 | - | - |
| 0.6928 | 32500 | 0.0157 | - | - |
| 0.6981 | 32750 | 0.0153 | - | - |
| 0.7034 | 33000 | 0.0141 | 0.0123 | 0.8012 |
| 0.7087 | 33250 | 0.0143 | - | - |
| 0.7141 | 33500 | 0.0121 | - | - |
| 0.7194 | 33750 | 0.0164 | - | - |
| 0.7247 | 34000 | 0.014 | 0.0121 | 0.8014 |
| 0.7301 | 34250 | 0.0147 | - | - |
| 0.7354 | 34500 | 0.0149 | - | - |
| 0.7407 | 34750 | 0.014 | - | - |
| 0.7460 | 35000 | 0.0156 | 0.0117 | 0.8022 |
| 0.7514 | 35250 | 0.0153 | - | - |
| 0.7567 | 35500 | 0.0146 | - | - |
| 0.7620 | 35750 | 0.0144 | - | - |
| 0.7674 | 36000 | 0.0139 | 0.0111 | 0.8035 |
| 0.7727 | 36250 | 0.0134 | - | - |
| 0.7780 | 36500 | 0.013 | - | - |
| 0.7833 | 36750 | 0.0156 | - | - |
| 0.7887 | 37000 | 0.0144 | 0.0108 | 0.8048 |
| 0.7940 | 37250 | 0.0133 | - | - |
| 0.7993 | 37500 | 0.0154 | - | - |
| 0.8047 | 37750 | 0.0132 | - | - |
| 0.8100 | 38000 | 0.013 | 0.0108 | 0.8063 |
| 0.8153 | 38250 | 0.0126 | - | - |
| 0.8207 | 38500 | 0.0135 | - | - |
| 0.8260 | 38750 | 0.014 | - | - |
| 0.8313 | 39000 | 0.013 | 0.0109 | 0.8086 |
| 0.8366 | 39250 | 0.0136 | - | - |
| 0.8420 | 39500 | 0.0141 | - | - |
| 0.8473 | 39750 | 0.0155 | - | - |
| 0.8526 | 40000 | 0.0153 | 0.0106 | 0.8075 |
| 0.8580 | 40250 | 0.0131 | - | - |
| 0.8633 | 40500 | 0.0128 | - | - |
| 0.8686 | 40750 | 0.013 | - | - |
| 0.8739 | 41000 | 0.0133 | 0.0109 | 0.8060 |
| 0.8793 | 41250 | 0.0119 | - | - |
| 0.8846 | 41500 | 0.0144 | - | - |
| 0.8899 | 41750 | 0.0142 | - | - |
| 0.8953 | 42000 | 0.0138 | 0.0105 | 0.8083 |
| 0.9006 | 42250 | 0.014 | - | - |
| 0.9059 | 42500 | 0.0134 | - | - |
| 0.9112 | 42750 | 0.0134 | - | - |
| 0.9166 | 43000 | 0.0124 | 0.0106 | 0.8113 |
| 0.9219 | 43250 | 0.0122 | - | - |
| 0.9272 | 43500 | 0.0126 | - | - |
| 0.9326 | 43750 | 0.0121 | - | - |
| 0.9379 | 44000 | 0.0137 | 0.0103 | 0.8105 |
| 0.9432 | 44250 | 0.0132 | - | - |
| 0.9485 | 44500 | 0.012 | - | - |
| 0.9539 | 44750 | 0.0136 | - | - |
| 0.9592 | 45000 | 0.0133 | 0.0104 | 0.8112 |
| 0.9645 | 45250 | 0.0118 | - | - |
| 0.9699 | 45500 | 0.0132 | - | - |
| 0.9752 | 45750 | 0.0118 | - | - |
| 0.9805 | 46000 | 0.012 | 0.0102 | 0.8104 |
| 0.9858 | 46250 | 0.0127 | - | - |
| 0.9912 | 46500 | 0.0134 | - | - |
| 0.9965 | 46750 | 0.0121 | - | - |
| 1.0 | 46914 | - | - | 0.8111 |

</details>

## Environmental Impact

Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).

- Energy Consumed: 1.637 kWh
- Carbon Emitted: 0.636 kg of CO2
- Hours Used: 4.514 hours
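
For reference, a minimal sketch of how CodeCarbon wraps a training run to produce such figures; the `train()` function is a placeholder for the actual run:

```python
# Measure energy use and estimated emissions around a training run.
from codecarbon import EmissionsTracker

def train() -> None:
    ...  # placeholder for the actual training loop

tracker = EmissionsTracker()
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
print(f"{emissions_kg:.3f} kg CO2eq")
```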

### Training Hardware

- On Cloud: No
- GPU Model: 1 x NVIDIA GeForce RTX 3090
- CPU Model: 13th Gen Intel(R) Core(TM) i7-13700K
- RAM Size: 31.78 GB

## Framework Versions

- Python: 3.11.6
- Sentence Transformers: 3.1.0.dev0
- Transformers: 4.41.2
- PyTorch: 2.3.0+cu121
- Accelerate: 0.30.1
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```