SentenceTransformer based on AI-Growth-Lab/PatentSBERTa

This is a sentence-transformers model finetuned from AI-Growth-Lab/PatentSBERTa. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: AI-Growth-Lab/PatentSBERTa
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
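The Pooling module above is configured with pooling_mode_cls_token=True, so the sentence embedding is the transformer's first ([CLS]) token vector rather than an average over tokens. A minimal pure-Python sketch of the difference (illustrative only, not the library's internals; the toy 3-token, 2-dimensional input is an assumption):

```python
def cls_pooling(token_embeddings):
    """CLS pooling: the sentence vector is the first token's embedding."""
    return token_embeddings[0]

def mean_pooling(token_embeddings):
    """Mean pooling (disabled in this model): average over all tokens."""
    dim = len(token_embeddings[0])
    n = len(token_embeddings)
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

tokens = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # toy per-token vectors
print(cls_pooling(tokens))   # [1.0, 2.0]
print(mean_pooling(tokens))  # [3.0, 4.0]
```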

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("engineai/immensa_embeddings")
# Run inference
sentences = [
    'The air compressor device according to claim 4 with two opposite side wings formed on the pole body and extending between the front end and the flange of the pole body along the axis defining two opposite wide faces and two opposite narrow faces with a area smaller than that of the wide faces with two recessions respectively formed in the side wings adjacent to the front end of the pole body with the engaging ring mounted in the recessions with the engaging ring including front and rear flanges spaced along the axis and an annular engaging groove formed between the front and rear flanges with a buffer ring mounted in the engaging groove of the engaging ring and being in contact with the inner wall of the second end of the axial hole.',
    'Accordingly the sealing and inflating assembly includes an air compressing device including an improved tire repairing container for quickly coupling and attaching and securing to an outlet tube of the air compressor and for quickly disengaging from the air compressor and for allowing the tire sealing preparation to be effectively supplied to seal and inflate the inflatable objects and for easily and quickly and changeably attaching and securing to the outlet tube of the air compressor.',
    'Thus since the path delay from interface 102 to unit 2 is known to be 1 and interface 102 is on the same unit as interface 101 and the path delay between interface 101 and unit 0 is known 1 the path delay between interface 001 and unit 2 can be entered as 112.The path delay table then becomes as shown in Table 3 immediately below.Table 3Dest Unit IdUnit 0 1 2 3 40 IF 001 0 1 2 255 2550 IF 002 0 255 2 1 21 IF 101 1 0 255 2 2551 IF 102 255 0 1 2 22 IF 201 2 1 0 255 2552 IF 202 255 255 0 2 12 IF 203 2 255 0 1 23 IF 301 1 2 255 0 2553 IF 302 255 2 1 0 23 IF 303 255 255 2 0 14 IF 401 255 2 1 2 04 IF 402 2 255 2 1 0',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
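model.similarity above uses the model's configured similarity function, cosine similarity. A minimal sketch of that computation on plain Python lists (the toy vectors are assumptions, standing in for the 768-dimensional embeddings):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: dot product of the vectors over the
    product of their Euclidean norms, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

u = [1.0, 0.0, 1.0]
v = [1.0, 1.0, 0.0]
print(round(cosine_similarity(u, v), 4))  # 0.5
print(round(cosine_similarity(u, u), 4))  # 1.0
```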

Evaluation

Metrics

Binary Classification

Metric Value
cosine_accuracy 1.0
cosine_accuracy_threshold -0.0896
cosine_f1 1.0
cosine_f1_threshold -0.0896
cosine_precision 1.0
cosine_recall 1.0
cosine_ap 1.0
dot_accuracy 1.0
dot_accuracy_threshold -3.4987
dot_f1 1.0
dot_f1_threshold -3.4987
dot_precision 1.0
dot_recall 1.0
dot_ap 1.0
manhattan_accuracy 1.0
manhattan_accuracy_threshold 203.8069
manhattan_f1 1.0
manhattan_f1_threshold 203.8069
manhattan_precision 1.0
manhattan_recall 1.0
manhattan_ap 1.0
euclidean_accuracy 1.0
euclidean_accuracy_threshold 9.311
euclidean_f1 1.0
euclidean_f1_threshold 9.311
euclidean_precision 1.0
euclidean_recall 1.0
euclidean_ap 1.0
max_accuracy 1.0
max_accuracy_threshold 203.8069
max_f1 1.0
max_f1_threshold 203.8069
max_precision 1.0
max_recall 1.0
max_ap 1.0
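Each metric above is obtained by thresholding a similarity (or distance) score and comparing the resulting prediction to the pair label; since the evaluation-set labels reported below are uniformly 1.0, saturated scores of 1.0 are expected here. A sketch of the accuracy-at-threshold calculation, with hypothetical score and label arrays as assumptions:

```python
def accuracy_at_threshold(scores, labels, threshold, higher_is_positive=True):
    """Fraction of pairs classified correctly when scores on the positive
    side of the threshold (above for similarities, below for distances)
    are predicted as matching pairs."""
    correct = 0
    for s, y in zip(scores, labels):
        pred = 1 if ((s >= threshold) == higher_is_positive) else 0
        correct += pred == y
    return correct / len(scores)

# Toy cosine scores: positives above the threshold, negatives below
scores = [0.9, 0.7, -0.2, -0.5]
labels = [1, 1, 0, 0]
print(accuracy_at_threshold(scores, labels, threshold=0.0))  # 1.0
```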

Training Details

Training Dataset

Unnamed Dataset

  • Size: 2,545,432 training samples
  • Columns: text_a, text_b, and label
  • Approximate statistics based on the first 1000 samples:
    Column  Type    Min        Mean           Max
    text_a  string  14 tokens  75.3 tokens    512 tokens
    text_b  string  8 tokens   117.82 tokens  512 tokens
    label   float   1.0        1.0            1.0
  • Samples:
    text_a text_b label
    A method for displaying a photo the method being performed by a display apparatus and comprising transmitting identification information of the display apparatus to a first portable apparatus and a second portable apparatus connecting through an access point to the first portable apparatus which receives the identification information of the display apparatus connecting through the access point to the second portable apparatus which receives the identification information of the display apparatus in response to a first photo being received from the first portable apparatus displaying the first photo on a display of the display apparatus and in response to a second photo being received from the second portable apparatus displaying the second photo on the display on which the first photo is displayed wherein the displaying the second photo comprises displaying the first photo and the second photo all together in different sizes. Further applications may also be loaded onto the electronic device 902 through for example the wireless network 904 an auxiliary IO device 922 USB port 924 shortrange communications subsystem 936 or any combination of these interfaces.Such applications are then able to be installed by a user in the RAM 920 or a nonvolatile store for execution by the microprocessor 916. 1.0
    The method of any one of claims 8 to 9 wherein the high energydensity welding process uses both a laser and GMAW welding and wherein the GMAW welding is selected from the group comprising MIG welding and MAG welding. Indicated at 27 is a wide platelike material employed as a base or starting material in the fabrication of thesquare tubular structure 22.As shown in Figs.3 to 5 this wide platelike material 27 is composed of a longitudinally extending flat thin plates 28 29 thick corner plates 30 and thick plates 31 which are joined together side to side by butt welding.More particularly for example these plate materials are prejoined by high energy density welding like laser welding capable of deep penetration. 1.0
    An apparatus of a server in a communication system comprising a transceiver configured to communicating with at least one of a network node and an external server and a controller configured to receive identifier information of a user equipment UE obtain information on a second authentication key for authenticating the UE if an error is detected for a first authentication key corresponding to the identifier information and authenticate the UE based on the information on the second authentication key. Figure 2 illustrates an exemplary block diagram of an apparatus providing an access point node functionality.For the sake of clarity the apparatus or a corresponding component is called herein as an access point node.The access point node 200 is a computing device comprising not only prior art means but also means for implementing access point node functionality described with an embodiment and it may comprise separate means for each separate function or means may be configured to perform two or more functions and even to combine functions of different embodiments.These means may be implemented by various techniques.For example the means may be implemented in hardware one or more apparatuses firmware one or more apparatuses software one or more modules or combinations thereof.For afirmware or software implementation can be through unitsmodules e.g.procedures functions and so on that perform the functions described herein. 1.0
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
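MultipleNegativesRankingLoss treats each (text_a, text_b) pair as a positive and the other in-batch candidates as negatives, applying softmax cross-entropy over scaled similarities. A pure-Python sketch of that computation (scale=20.0 as configured above; the toy 2x2 similarity matrix is an assumption):

```python
import math

def mnr_loss(sim_matrix, scale=20.0):
    """sim_matrix[i][j]: similarity of anchor i to candidate j.
    The true candidate for anchor i sits at column i; all other
    columns in row i act as in-batch negatives."""
    total = 0.0
    for i, row in enumerate(sim_matrix):
        logits = [scale * s for s in row]
        log_denom = math.log(sum(math.exp(l) for l in logits))
        total += -(logits[i] - log_denom)  # cross-entropy, target class i
    return total / len(sim_matrix)

# Diagonal (true pairs) dominates, so the loss is near zero
sims = [[0.9, 0.1],
        [0.2, 0.8]]
print(mnr_loss(sims))  # small positive value
```

The loss drives matching pairs toward high cosine similarity relative to every other candidate in the batch, which is why larger batch sizes generally give harder negatives.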
    

Evaluation Dataset

Unnamed Dataset

  • Size: 133,971 evaluation samples
  • Columns: text_a, text_b, and label
  • Approximate statistics based on the first 1000 samples:
    Column  Type    Min        Mean           Max
    text_a  string  14 tokens  73.52 tokens   512 tokens
    text_b  string  7 tokens   113.72 tokens  512 tokens
    label   float   1.0        1.0            1.0
  • Samples:
    text_a text_b label
    The display apparatus according to claim 3 wherein the curvature of the display panel in the second state is gradually reduced with increasing distance from the center portion or decreasing distance to the edge portion in the second direction. Referring to FIG.5 in the shape deformingmaintaining unit 120 an actuator 123 is arranged between a first electrode 121 and a second electrode 122.The first electrode 121 is divided into a plurality of parts that are arranged on a surface of the actuator 123 in parallel and are set apart by a distance for example a constant distance whereas the second electrode 122 is arranged completely on the other surface of the actuator 123.When a circuit unit 124 applies voltages for example predetermined voltages V1 V2 and V3 to each of the divided parts of the first electrode 121 and a voltage for example a predetermined voltage to the second electrode 122 under control of a control unit see FIG.8 the actuator 123 changes shape in response to the applied voltages and thus the curvature of the flexible display panel 110 arranged on a surface of the shape deformingmaintaining unit 120 may be controlled. 1.0
    The organic light emitting display device 10 of claim 4 wherein the shared node Ns has ashape ashape ashape or ashape. The first transistor T1 is a transistor that is controlled by the sensing signal SENSE supplied by the first gate line GL1 is connected between the reference voltage line RVL supplying the reference voltage Vref or a connection pattern CP connected to the reference voltage line and the first node N1 of the driving transistor DT and is concerned in a sensing mode and is also referred to as a sensor transistor. 1.0
    Measuring machine 10 according to claim 1 or 2 characterized in thatthe generated laser beam 15 with its oval 16 or linelike 16 crosssection is adjusted with the length of its crosssection 16 16 orthogonal to the triangulation baseline b. According to the present exemplary embodiment a oneshot active type range sensor is used.Such a range sensor irradiates the object with multislit lines to which color identifications ID of different wavelength are attached.The range sensor then captures reflected light using the camera and measures the range employing triangulation. 1.0
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • learning_rate: 2e-05
  • weight_decay: 0.01
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • load_best_model_at_end: True
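Since warmup_steps is 0, the scheduler derives the warmup length from warmup_ratio and the total step count. A sketch of that arithmetic, assuming a single device and no gradient accumulation (Transformers computes warmup as the ceiling of ratio times total steps):

```python
import math

train_samples = 2_545_432   # training set size from above
batch_size = 32             # per_device_train_batch_size
epochs = 1
warmup_ratio = 0.1

steps_per_epoch = math.ceil(train_samples / batch_size)  # drop_last is False
total_steps = steps_per_epoch * epochs
warmup_steps = math.ceil(warmup_ratio * total_steps)

print(total_steps)   # 79545
print(warmup_steps)  # 7955
```

This is consistent with the training logs below, where step 60000 corresponds to epoch 0.7543 (60000 / 79545 ≈ 0.754).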

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss max_ap
0.0013 100 2.2516 - -
0.0025 200 2.0871 - -
0.0038 300 1.8806 - -
0.0050 400 1.6918 - -
0.0063 500 1.6214 - -
0.0075 600 1.6796 - -
0.0088 700 1.6149 - -
0.0101 800 1.6178 - -
0.0113 900 1.5865 - -
0.0126 1000 1.4899 - -
0.0138 1100 1.4679 - -
0.0151 1200 1.4484 - -
0.0163 1300 1.4421 - -
0.0176 1400 1.4058 - -
0.0189 1500 1.4092 - -
0.0201 1600 1.3989 - -
0.0214 1700 1.3503 - -
0.0226 1800 1.3746 - -
0.0239 1900 1.3267 - -
0.0251 2000 1.2934 - -
0.0264 2100 1.3097 - -
0.0277 2200 1.2584 - -
0.0289 2300 1.2713 - -
0.0302 2400 1.2769 - -
0.0314 2500 1.2311 - -
0.0327 2600 1.2591 - -
0.0339 2700 1.2431 - -
0.0352 2800 1.2325 - -
0.0365 2900 1.2377 - -
0.0377 3000 1.1868 - -
0.0390 3100 1.1844 - -
0.0402 3200 1.1939 - -
0.0415 3300 1.2116 - -
0.0427 3400 1.1737 - -
0.0440 3500 1.1517 - -
0.0453 3600 1.1291 - -
0.0465 3700 1.1196 - -
0.0478 3800 1.1149 - -
0.0490 3900 1.1194 - -
0.0503 4000 1.0612 - -
0.0515 4100 1.0918 - -
0.0528 4200 1.0629 - -
0.0541 4300 1.0455 - -
0.0553 4400 1.0625 - -
0.0566 4500 1.0586 - -
0.0578 4600 1.0057 - -
0.0591 4700 1.0269 - -
0.0603 4800 1.0543 - -
0.0616 4900 1.0117 - -
0.0629 5000 1.0412 - -
0.0641 5100 0.9773 - -
0.0654 5200 0.9889 - -
0.0666 5300 0.9499 - -
0.0679 5400 0.9528 - -
0.0691 5500 0.9468 - -
0.0704 5600 0.9202 - -
0.0717 5700 0.9896 - -
0.0729 5800 0.9795 - -
0.0742 5900 0.9582 - -
0.0754 6000 0.9254 - -
0.0767 6100 0.9476 - -
0.0779 6200 0.9157 - -
0.0792 6300 0.9221 - -
0.0805 6400 0.8769 - -
0.0817 6500 0.8659 - -
0.0830 6600 0.9024 - -
0.0842 6700 0.8898 - -
0.0855 6800 0.8952 - -
0.0867 6900 0.8773 - -
0.0880 7000 0.8676 - -
0.0893 7100 0.8263 - -
0.0905 7200 0.8782 - -
0.0918 7300 0.8295 - -
0.0930 7400 0.873 - -
0.0943 7500 0.8292 - -
0.0955 7600 0.8301 - -
0.0968 7700 0.858 - -
0.0981 7800 0.8326 - -
0.0993 7900 0.8334 - -
0.1006 8000 0.8078 - -
0.1018 8100 0.8069 - -
0.1031 8200 0.8206 - -
0.1043 8300 0.7977 - -
0.1056 8400 0.7886 - -
0.1069 8500 0.8012 - -
0.1081 8600 0.7752 - -
0.1094 8700 0.7324 - -
0.1106 8800 0.7507 - -
0.1119 8900 0.7568 - -
0.1131 9000 0.7436 - -
0.1144 9100 0.761 - -
0.1157 9200 0.7374 - -
0.1169 9300 0.7127 - -
0.1182 9400 0.7132 - -
0.1194 9500 0.7713 - -
0.1207 9600 0.7215 - -
0.1219 9700 0.7415 - -
0.1232 9800 0.716 - -
0.1245 9900 0.7365 - -
0.1257 10000 0.7232 0.6556 1.0000
0.1270 10100 0.7121 - -
0.1282 10200 0.7019 - -
0.1295 10300 0.7208 - -
0.1307 10400 0.6817 - -
0.1320 10500 0.6886 - -
0.1333 10600 0.7182 - -
0.1345 10700 0.698 - -
0.1358 10800 0.7276 - -
0.1370 10900 0.6836 - -
0.1383 11000 0.6855 - -
0.1395 11100 0.6702 - -
0.1408 11200 0.6851 - -
0.1421 11300 0.6107 - -
0.1433 11400 0.6615 - -
0.1446 11500 0.6703 - -
0.1458 11600 0.6318 - -
0.1471 11700 0.6277 - -
0.1483 11800 0.6321 - -
0.1496 11900 0.6411 - -
0.1509 12000 0.6231 - -
0.1521 12100 0.6163 - -
0.1534 12200 0.6087 - -
0.1546 12300 0.6068 - -
0.1559 12400 0.6223 - -
0.1571 12500 0.6165 - -
0.1584 12600 0.6345 - -
0.1597 12700 0.6153 - -
0.1609 12800 0.5805 - -
0.1622 12900 0.5997 - -
0.1634 13000 0.6038 - -
0.1647 13100 0.6061 - -
0.1659 13200 0.5828 - -
0.1672 13300 0.5964 - -
0.1685 13400 0.5842 - -
0.1697 13500 0.6142 - -
0.1710 13600 0.5633 - -
0.1722 13700 0.5828 - -
0.1735 13800 0.5728 - -
0.1747 13900 0.5335 - -
0.1760 14000 0.5474 - -
0.1773 14100 0.5593 - -
0.1785 14200 0.5765 - -
0.1798 14300 0.565 - -
0.1810 14400 0.5969 - -
0.1823 14500 0.5569 - -
0.1835 14600 0.5344 - -
0.1848 14700 0.5891 - -
0.1861 14800 0.5743 - -
0.1873 14900 0.5224 - -
0.1886 15000 0.5168 - -
0.1898 15100 0.5408 - -
0.1911 15200 0.5385 - -
0.1923 15300 0.5284 - -
0.1936 15400 0.5415 - -
0.1949 15500 0.5191 - -
0.1961 15600 0.4924 - -
0.1974 15700 0.527 - -
0.1986 15800 0.5164 - -
0.1999 15900 0.5011 - -
0.2011 16000 0.5328 - -
0.2024 16100 0.535 - -
0.2037 16200 0.5348 - -
0.2049 16300 0.499 - -
0.2062 16400 0.5208 - -
0.2074 16500 0.4906 - -
0.2087 16600 0.4906 - -
0.2099 16700 0.473 - -
0.2112 16800 0.4769 - -
0.2125 16900 0.5211 - -
0.2137 17000 0.5148 - -
0.2150 17100 0.5263 - -
0.2162 17200 0.5052 - -
0.2175 17300 0.4957 - -
0.2187 17400 0.4649 - -
0.2200 17500 0.4991 - -
0.2213 17600 0.4833 - -
0.2225 17700 0.4675 - -
0.2238 17800 0.4643 - -
0.2250 17900 0.4769 - -
0.2263 18000 0.4652 - -
0.2275 18100 0.4372 - -
0.2288 18200 0.4966 - -
0.2301 18300 0.4431 - -
0.2313 18400 0.4685 - -
0.2326 18500 0.4708 - -
0.2338 18600 0.4846 - -
0.2351 18700 0.4399 - -
0.2363 18800 0.4382 - -
0.2376 18900 0.4746 - -
0.2389 19000 0.4654 - -
0.2401 19100 0.4834 - -
0.2414 19200 0.4456 - -
0.2426 19300 0.4778 - -
0.2439 19400 0.4367 - -
0.2451 19500 0.4287 - -
0.2464 19600 0.4494 - -
0.2477 19700 0.4381 - -
0.2489 19800 0.4593 - -
0.2502 19900 0.4304 - -
0.2514 20000 0.4415 0.3788 1.0000
0.2527 20100 0.4636 - -
0.2539 20200 0.4549 - -
0.2552 20300 0.4422 - -
0.2565 20400 0.4791 - -
0.2577 20500 0.4369 - -
0.2590 20600 0.4386 - -
0.2602 20700 0.4481 - -
0.2615 20800 0.4132 - -
0.2627 20900 0.4408 - -
0.2640 21000 0.4138 - -
0.2653 21100 0.415 - -
0.2665 21200 0.4444 - -
0.2678 21300 0.3727 - -
0.2690 21400 0.4074 - -
0.2703 21500 0.4292 - -
0.2715 21600 0.4366 - -
0.2728 21700 0.4325 - -
0.2741 21800 0.448 - -
0.2753 21900 0.4308 - -
0.2766 22000 0.392 - -
0.2778 22100 0.3922 - -
0.2791 22200 0.44 - -
0.2803 22300 0.4091 - -
0.2816 22400 0.4219 - -
0.2829 22500 0.4235 - -
0.2841 22600 0.3979 - -
0.2854 22700 0.4073 - -
0.2866 22800 0.3857 - -
0.2879 22900 0.4143 - -
0.2891 23000 0.3682 - -
0.2904 23100 0.4224 - -
0.2917 23200 0.3771 - -
0.2929 23300 0.3845 - -
0.2942 23400 0.4116 - -
0.2954 23500 0.3954 - -
0.2967 23600 0.4193 - -
0.2979 23700 0.3837 - -
0.2992 23800 0.3794 - -
0.3005 23900 0.3788 - -
0.3017 24000 0.372 - -
0.3030 24100 0.4016 - -
0.3042 24200 0.3936 - -
0.3055 24300 0.3957 - -
0.3067 24400 0.3742 - -
0.3080 24500 0.4021 - -
0.3093 24600 0.3876 - -
0.3105 24700 0.3903 - -
0.3118 24800 0.3402 - -
0.3130 24900 0.3606 - -
0.3143 25000 0.3578 - -
0.3155 25100 0.3874 - -
0.3168 25200 0.3545 - -
0.3181 25300 0.3586 - -
0.3193 25400 0.3566 - -
0.3206 25500 0.3552 - -
0.3218 25600 0.3576 - -
0.3231 25700 0.3764 - -
0.3243 25800 0.3779 - -
0.3256 25900 0.3711 - -
0.3269 26000 0.3449 - -
0.3281 26100 0.3715 - -
0.3294 26200 0.3683 - -
0.3306 26300 0.3712 - -
0.3319 26400 0.3393 - -
0.3331 26500 0.3236 - -
0.3344 26600 0.36 - -
0.3357 26700 0.3531 - -
0.3369 26800 0.3711 - -
0.3382 26900 0.3483 - -
0.3394 27000 0.3541 - -
0.3407 27100 0.3455 - -
0.3419 27200 0.3782 - -
0.3432 27300 0.3228 - -
0.3445 27400 0.3561 - -
0.3457 27500 0.3439 - -
0.3470 27600 0.3356 - -
0.3482 27700 0.3348 - -
0.3495 27800 0.3256 - -
0.3507 27900 0.3453 - -
0.3520 28000 0.3563 - -
0.3533 28100 0.354 - -
0.3545 28200 0.3328 - -
0.3558 28300 0.3278 - -
0.3570 28400 0.3693 - -
0.3583 28500 0.3379 - -
0.3595 28600 0.3123 - -
0.3608 28700 0.3599 - -
0.3621 28800 0.3312 - -
0.3633 28900 0.3335 - -
0.3646 29000 0.3189 - -
0.3658 29100 0.3326 - -
0.3671 29200 0.3278 - -
0.3683 29300 0.3408 - -
0.3696 29400 0.3216 - -
0.3709 29500 0.3233 - -
0.3721 29600 0.3192 - -
0.3734 29700 0.3459 - -
0.3746 29800 0.3341 - -
0.3759 29900 0.319 - -
0.3771 30000 0.3091 0.2659 1.0000
0.3784 30100 0.3207 - -
0.3797 30200 0.3114 - -
0.3809 30300 0.3074 - -
0.3822 30400 0.3252 - -
0.3834 30500 0.3058 - -
0.3847 30600 0.3244 - -
0.3859 30700 0.3343 - -
0.3872 30800 0.3279 - -
0.3885 30900 0.2983 - -
0.3897 31000 0.3198 - -
0.3910 31100 0.3102 - -
0.3922 31200 0.3407 - -
0.3935 31300 0.323 - -
0.3947 31400 0.2921 - -
0.3960 31500 0.2928 - -
0.3973 31600 0.309 - -
0.3985 31700 0.2867 - -
0.3998 31800 0.2973 - -
0.4010 31900 0.2889 - -
0.4023 32000 0.3163 - -
0.4035 32100 0.3139 - -
0.4048 32200 0.3081 - -
0.4061 32300 0.3155 - -
0.4073 32400 0.2674 - -
0.4086 32500 0.2979 - -
0.4098 32600 0.2653 - -
0.4111 32700 0.2758 - -
0.4123 32800 0.3062 - -
0.4136 32900 0.2832 - -
0.4149 33000 0.2866 - -
0.4161 33100 0.2925 - -
0.4174 33200 0.2784 - -
0.4186 33300 0.2859 - -
0.4199 33400 0.3033 - -
0.4211 33500 0.2755 - -
0.4224 33600 0.2803 - -
0.4237 33700 0.2968 - -
0.4249 33800 0.2972 - -
0.4262 33900 0.2774 - -
0.4274 34000 0.2859 - -
0.4287 34100 0.2884 - -
0.4299 34200 0.2657 - -
0.4312 34300 0.2839 - -
0.4325 34400 0.2843 - -
0.4337 34500 0.3019 - -
0.4350 34600 0.2755 - -
0.4362 34700 0.3062 - -
0.4375 34800 0.2848 - -
0.4387 34900 0.296 - -
0.4400 35000 0.2859 - -
0.4413 35100 0.2678 - -
0.4425 35200 0.2723 - -
0.4438 35300 0.2761 - -
0.4450 35400 0.3142 - -
0.4463 35500 0.2643 - -
0.4475 35600 0.3025 - -
0.4488 35700 0.254 - -
0.4501 35800 0.2831 - -
0.4513 35900 0.274 - -
0.4526 36000 0.2979 - -
0.4538 36100 0.292 - -
0.4551 36200 0.2674 - -
0.4563 36300 0.274 - -
0.4576 36400 0.2781 - -
0.4589 36500 0.279 - -
0.4601 36600 0.2781 - -
0.4614 36700 0.2512 - -
0.4626 36800 0.2775 - -
0.4639 36900 0.2726 - -
0.4651 37000 0.2495 - -
0.4664 37100 0.2659 - -
0.4677 37200 0.2885 - -
0.4689 37300 0.2613 - -
0.4702 37400 0.2508 - -
0.4714 37500 0.2451 - -
0.4727 37600 0.2644 - -
0.4739 37700 0.2713 - -
0.4752 37800 0.2697 - -
0.4765 37900 0.2661 - -
0.4777 38000 0.2538 - -
0.4790 38100 0.2543 - -
0.4802 38200 0.2545 - -
0.4815 38300 0.2507 - -
0.4827 38400 0.2698 - -
0.4840 38500 0.2475 - -
0.4853 38600 0.2579 - -
0.4865 38700 0.2483 - -
0.4878 38800 0.2387 - -
0.4890 38900 0.2689 - -
0.4903 39000 0.2544 - -
0.4915 39100 0.258 - -
0.4928 39200 0.2519 - -
0.4941 39300 0.2493 - -
0.4953 39400 0.2465 - -
0.4966 39500 0.2424 - -
0.4978 39600 0.2567 - -
0.4991 39700 0.2399 - -
0.5003 39800 0.2291 - -
0.5016 39900 0.2715 - -
0.5029 40000 0.239 0.1931 1.0000
0.5041 40100 0.2294 - -
0.5054 40200 0.2457 - -
0.5066 40300 0.2481 - -
0.5079 40400 0.2348 - -
0.5091 40500 0.2378 - -
0.5104 40600 0.2486 - -
0.5117 40700 0.2374 - -
0.5129 40800 0.2498 - -
0.5142 40900 0.222 - -
0.5154 41000 0.2321 - -
0.5167 41100 0.2435 - -
0.5179 41200 0.2383 - -
0.5192 41300 0.2484 - -
0.5205 41400 0.2437 - -
0.5217 41500 0.2326 - -
0.5230 41600 0.2635 - -
0.5242 41700 0.2524 - -
0.5255 41800 0.239 - -
0.5267 41900 0.2532 - -
0.5280 42000 0.2279 - -
0.5293 42100 0.2336 - -
0.5305 42200 0.2388 - -
0.5318 42300 0.241 - -
0.5330 42400 0.2164 - -
0.5343 42500 0.2369 - -
0.5355 42600 0.2352 - -
0.5368 42700 0.2678 - -
0.5381 42800 0.221 - -
0.5393 42900 0.2312 - -
0.5406 43000 0.2272 - -
0.5418 43100 0.2306 - -
0.5431 43200 0.2334 - -
0.5443 43300 0.2287 - -
0.5456 43400 0.2258 - -
0.5469 43500 0.2179 - -
0.5481 43600 0.1964 - -
0.5494 43700 0.2178 - -
0.5506 43800 0.2339 - -
0.5519 43900 0.2349 - -
0.5531 44000 0.2141 - -
0.5544 44100 0.2324 - -
0.5557 44200 0.2422 - -
0.5569 44300 0.2281 - -
0.5582 44400 0.2359 - -
0.5594 44500 0.1989 - -
0.5607 44600 0.2168 - -
0.5619 44700 0.2332 - -
0.5632 44800 0.2121 - -
0.5645 44900 0.2216 - -
0.5657 45000 0.2059 - -
0.5670 45100 0.2392 - -
0.5682 45200 0.2358 - -
0.5695 45300 0.2438 - -
0.5707 45400 0.2115 - -
0.5720 45500 0.2231 - -
0.5733 45600 0.2294 - -
0.5745 45700 0.2172 - -
0.5758 45800 0.2154 - -
0.5770 45900 0.233 - -
0.5783 46000 0.2369 - -
0.5795 46100 0.2186 - -
0.5808 46200 0.2125 - -
0.5821 46300 0.2098 - -
0.5833 46400 0.2263 - -
0.5846 46500 0.2264 - -
0.5858 46600 0.2132 - -
0.5871 46700 0.2283 - -
0.5883 46800 0.2115 - -
0.5896 46900 0.202 - -
0.5909 47000 0.1838 - -
0.5921 47100 0.21 - -
0.5934 47200 0.2137 - -
0.5946 47300 0.1944 - -
0.5959 47400 0.2158 - -
0.5971 47500 0.2004 - -
0.5984 47600 0.1946 - -
0.5997 47700 0.1893 - -
0.6009 47800 0.191 - -
0.6022 47900 0.2018 - -
0.6034 48000 0.1847 - -
0.6047 48100 0.1804 - -
0.6059 48200 0.2164 - -
0.6072 48300 0.1923 - -
0.6085 48400 0.2005 - -
0.6097 48500 0.2005 - -
0.6110 48600 0.1935 - -
0.6122 48700 0.1834 - -
0.6135 48800 0.2091 - -
0.6147 48900 0.1928 - -
0.6160 49000 0.1989 - -
0.6173 49100 0.1953 - -
0.6185 49200 0.2105 - -
0.6198 49300 0.2004 - -
0.6210 49400 0.1817 - -
0.6223 49500 0.2062 - -
0.6235 49600 0.216 - -
0.6248 49700 0.209 - -
0.6261 49800 0.1902 - -
0.6273 49900 0.212 - -
0.6286 50000 0.2267 0.1532 1.0
0.6298 50100 0.2134 - -
0.6311 50200 0.2255 - -
0.6323 50300 0.2125 - -
0.6336 50400 0.2113 - -
0.6349 50500 0.2082 - -
0.6361 50600 0.1949 - -
0.6374 50700 0.2024 - -
0.6386 50800 0.2228 - -
0.6399 50900 0.2057 - -
0.6411 51000 0.1882 - -
0.6424 51100 0.2015 - -
0.6437 51200 0.207 - -
0.6449 51300 0.1934 - -
0.6462 51400 0.1743 - -
0.6474 51500 0.2046 - -
0.6487 51600 0.2059 - -
0.6499 51700 0.1878 - -
0.6512 51800 0.2189 - -
0.6525 51900 0.1949 - -
0.6537 52000 0.1982 - -
0.6550 52100 0.1746 - -
0.6562 52200 0.1887 - -
0.6575 52300 0.1856 - -
0.6587 52400 0.1955 - -
0.6600 52500 0.1937 - -
0.6613 52600 0.1912 - -
0.6625 52700 0.1987 - -
0.6638 52800 0.201 - -
0.6650 52900 0.1955 - -
0.6663 53000 0.1995 - -
0.6675 53100 0.1829 - -
0.6688 53200 0.1836 - -
0.6701 53300 0.1972 - -
0.6713 53400 0.2064 - -
0.6726 53500 0.1966 - -
0.6738 53600 0.191 - -
0.6751 53700 0.1834 - -
0.6763 53800 0.1976 - -
0.6776 53900 0.2128 - -
0.6789 54000 0.1948 - -
0.6801 54100 0.1801 - -
0.6814 54200 0.1904 - -
0.6826 54300 0.1979 - -
0.6839 54400 0.1916 - -
0.6851 54500 0.1963 - -
0.6864 54600 0.1809 - -
0.6877 54700 0.2013 - -
0.6889 54800 0.1796 - -
0.6902 54900 0.1855 - -
0.6914 55000 0.1742 - -
0.6927 55100 0.1937 - -
0.6939 55200 0.1819 - -
0.6952 55300 0.1909 - -
0.6965 55400 0.1737 - -
0.6977 55500 0.1882 - -
0.6990 55600 0.1733 - -
0.7002 55700 0.1672 - -
0.7015 55800 0.1879 - -
0.7027 55900 0.1705 - -
0.7040 56000 0.1928 - -
0.7053 56100 0.1834 - -
0.7065 56200 0.2112 - -
0.7078 56300 0.1718 - -
0.7090 56400 0.1767 - -
0.7103 56500 0.1932 - -
0.7115 56600 0.1767 - -
0.7128 56700 0.1769 - -
0.7141 56800 0.1656 - -
0.7153 56900 0.1795 - -
0.7166 57000 0.1869 - -
0.7178 57100 0.1868 - -
0.7191 57200 0.1754 - -
0.7203 57300 0.1986 - -
0.7216 57400 0.185 - -
0.7229 57500 0.1752 - -
0.7241 57600 0.174 - -
0.7254 57700 0.1758 - -
0.7266 57800 0.1751 - -
0.7279 57900 0.1766 - -
0.7291 58000 0.1659 - -
0.7304 58100 0.1752 - -
0.7317 58200 0.1786 - -
0.7329 58300 0.1798 - -
0.7342 58400 0.187 - -
0.7354 58500 0.1894 - -
0.7367 58600 0.1825 - -
0.7379 58700 0.1734 - -
0.7392 58800 0.1736 - -
0.7405 58900 0.1721 - -
0.7417 59000 0.1789 - -
0.7430 59100 0.1737 - -
0.7442 59200 0.1795 - -
0.7455 59300 0.1723 - -
0.7467 59400 0.1608 - -
0.7480 59500 0.1637 - -
0.7493 59600 0.1724 - -
0.7505 59700 0.1684 - -
0.7518 59800 0.1708 - -
0.7530 59900 0.1578 - -
0.7543 60000 0.1761 0.1251 1.0

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.0+cu118
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
Model Information

  • Model ID: engineai/immensa_embeddings
  • Model size: 109M params
  • Tensor type: F32 (Safetensors)
  • Finetuned from: AI-Growth-Lab/PatentSBERTa