
SentenceTransformer based on google-bert/bert-base-uncased

This is a sentence-transformers model finetuned from google-bert/bert-base-uncased on the all-nli-pair, all-nli-pair-class, all-nli-pair-score, all-nli-triplet, stsb, quora and natural-questions datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: google-bert/bert-base-uncased
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets: all-nli-pair, all-nli-pair-class, all-nli-pair-score, all-nli-triplet, stsb, quora, natural-questions

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
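
The same two-module stack can be rebuilt by hand, for example to experiment with a different pooling mode or sequence length. A minimal sketch, assuming the standard sentence_transformers.models API:

from sentence_transformers import SentenceTransformer, models

# BERT backbone, truncating inputs at the card's 512-token limit
word_embedding_model = models.Transformer("google-bert/bert-base-uncased", max_seq_length=512)
# Mean pooling over token embeddings, producing 768-dimensional sentence vectors
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode="mean")

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])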

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("kh-li/bert-base-all-nli-stsb-quora-nq")
# Run inference
sentences = [
    'There is a very full description of the various types of hormone rooting compound here.',
    'It is meant to stimulate root growth - in particular to stimulate the creation of roots.',
    "The least that can be said is that we must be born with the ability and 'knowledge' to learn.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
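
Continuing from the snippet above, the same embeddings can drive a small semantic-search lookup; the corpus and query strings here are purely illustrative:

corpus = [
    "Hormone rooting compounds are used to stimulate root growth in cuttings.",
    "The capital of New Zealand is Wellington.",
]
query = "What does rooting hormone do?"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarities between the query and every corpus entry, shape (1, len(corpus))
scores = model.similarity(query_embedding, corpus_embeddings)
best = scores[0].argmax().item()
print(corpus[best], scores[0, best].item())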

Training Details

Training Datasets

all-nli-pair

  • Dataset: all-nli-pair at d482672
  • Size: 10,000 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 5 tokens, mean 17.03 tokens, max 64 tokens
    • positive (string): min 4 tokens, mean 9.62 tokens, max 31 tokens
  • Samples:
    • anchor: A person on a horse jumps over a broken down airplane. | positive: A person is outdoors, on a horse.
    • anchor: Children smiling and waving at camera | positive: There are children present
    • anchor: A boy is jumping on skateboard in the middle of a red bridge. | positive: The boy does a skateboarding trick.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
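
A hedged sketch of the equivalent constructor call (model here is assumed to be a loaded SentenceTransformer):

from sentence_transformers import losses
from sentence_transformers.util import cos_sim

# Scale and similarity function as listed in the parameters above
mnrl = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)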
    

all-nli-pair-class

  • Dataset: all-nli-pair-class at d482672
  • Size: 10,000 training samples
  • Columns: premise, hypothesis, and label
  • Approximate statistics based on the first 1000 samples:
    • premise (string): min 6 tokens, mean 17.38 tokens, max 52 tokens
    • hypothesis (string): min 4 tokens, mean 10.7 tokens, max 31 tokens
    • label (int): 0: ~33.40%, 1: ~33.30%, 2: ~33.30%
  • Samples:
    • premise: A person on a horse jumps over a broken down airplane. | hypothesis: A person is training his horse for a competition. | label: 1
    • premise: A person on a horse jumps over a broken down airplane. | hypothesis: A person is at a diner, ordering an omelette. | label: 2
    • premise: A person on a horse jumps over a broken down airplane. | hypothesis: A person is outdoors, on a horse. | label: 0
  • Loss: ContrastiveLoss with these parameters:
    {
        "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
        "margin": 0.5,
        "size_average": true
    }
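
A hedged sketch of how these parameters map onto the ContrastiveLoss constructor (model is a loaded SentenceTransformer):

from sentence_transformers import losses

contrastive_loss = losses.ContrastiveLoss(
    model,
    distance_metric=losses.SiameseDistanceMetric.COSINE_DISTANCE,
    margin=0.5,
    size_average=True,
)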
    

all-nli-pair-score

  • Dataset: all-nli-pair-score at d482672
  • Size: 10,000 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 6 tokens, mean 17.38 tokens, max 52 tokens
    • sentence2 (string): min 4 tokens, mean 10.7 tokens, max 31 tokens
    • score (float): min 0.0, mean 0.5, max 1.0
  • Samples:
    • sentence1: A person on a horse jumps over a broken down airplane. | sentence2: A person is training his horse for a competition. | score: 0.5
    • sentence1: A person on a horse jumps over a broken down airplane. | sentence2: A person is at a diner, ordering an omelette. | score: 0.0
    • sentence1: A person on a horse jumps over a broken down airplane. | sentence2: A person is outdoors, on a horse. | score: 1.0
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
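
The loss_fct entry above is the default regression objective; a hedged sketch of the equivalent constructor call (model is a loaded SentenceTransformer):

import torch.nn as nn
from sentence_transformers import losses

cosine_loss = losses.CosineSimilarityLoss(model, loss_fct=nn.MSELoss())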
    

all-nli-triplet

  • Dataset: all-nli-triplet at d482672
  • Size: 10,000 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 7 tokens, mean 10.46 tokens, max 46 tokens
    • positive (string): min 6 tokens, mean 12.81 tokens, max 40 tokens
    • negative (string): min 5 tokens, mean 13.4 tokens, max 50 tokens
  • Samples:
    • anchor: A person on a horse jumps over a broken down airplane. | positive: A person is outdoors, on a horse. | negative: A person is at a diner, ordering an omelette.
    • anchor: Children smiling and waving at camera | positive: There are children present | negative: The kids are frowning
    • anchor: A boy is jumping on skateboard in the middle of a red bridge. | positive: The boy does a skateboarding trick. | negative: The boy skates down the sidewalk.
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
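
A hedged sketch of the matching TripletLoss construction, using Euclidean distance and the margin of 5 listed above (model is a loaded SentenceTransformer):

from sentence_transformers import losses

triplet_loss = losses.TripletLoss(
    model,
    distance_metric=losses.TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)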
    

stsb

  • Dataset: stsb at ab7a5ac
  • Size: 5,749 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 6 tokens, mean 10.0 tokens, max 28 tokens
    • sentence2 (string): min 5 tokens, mean 9.95 tokens, max 25 tokens
    • score (float): min 0.0, mean 0.54, max 1.0
  • Samples:
    • sentence1: A plane is taking off. | sentence2: An air plane is taking off. | score: 1.0
    • sentence1: A man is playing a large flute. | sentence2: A man is playing a flute. | score: 0.76
    • sentence1: A man is spreading shreded cheese on a pizza. | sentence2: A man is spreading shredded cheese on an uncooked pizza. | score: 0.76
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
    

quora

  • Dataset: quora at 451a485
  • Size: 10,000 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 6 tokens, mean 13.92 tokens, max 42 tokens
    • positive (string): min 6 tokens, mean 14.09 tokens, max 43 tokens
  • Samples:
    • anchor: Astrology: I am a Capricorn Sun Cap moon and cap rising...what does that say about me? | positive: I'm a triple Capricorn (Sun, Moon and ascendant in Capricorn) What does this say about me?
    • anchor: How can I be a good geologist? | positive: What should I do to be a great geologist?
    • anchor: How do I read and find my YouTube comments? | positive: How can I see all my Youtube comments?
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

natural-questions

  • Dataset: natural-questions at f9e894e
  • Size: 10,000 training samples
  • Columns: query and answer
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 10 tokens, mean 11.74 tokens, max 21 tokens
    • answer (string): min 17 tokens, mean 135.66 tokens, max 512 tokens
  • Samples:
    query answer
    when did richmond last play in a preliminary final Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tigers took over the game as it progressed and scored seven straight goals at one point. They eventually would win by 48 points – 16.12 (108) to Adelaide's 8.12 (60) – to end their 37-year flag drought.[22] Dustin Martin also became the first player to win a Premiership medal, the Brownlow Medal and the Norm Smith Medal in the same season, while Damien Hardwick was named AFL Coaches Association Coach of the Year. Richmond's jump from 13th to premiers also marked the biggest jump from one AFL season to the next.
    who sang what in the world's come over you Jack Scott (singer) At the beginning of 1960, Scott again changed record labels, this time to Top Rank Records.[1] He then recorded four Billboard Hot 100 hits – "What in the World's Come Over You" (#5), "Burning Bridges" (#3) b/w "Oh Little One" (#34), and "It Only Happened Yesterday" (#38).[1] "What in the World's Come Over You" was Scott's second gold disc winner.[6] Scott continued to record and perform during the 1960s and 1970s.[1] His song "You're Just Gettin' Better" reached the country charts in 1974.[1] In May 1977, Scott recorded a Peel session for BBC Radio 1 disc jockey, John Peel.
    who produces the most wool in the world Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
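
Each of the seven training datasets above is paired with its own loss. A hedged sketch of how such a mapping is typically passed to SentenceTransformerTrainer; only two of the datasets are shown, and the dataset names and splits below are assumptions:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("google-bert/bert-base-uncased")

# Dict of named training datasets (subset shown for brevity)
train_dataset = {
    "all-nli-pair": load_dataset("sentence-transformers/all-nli", "pair", split="train[:10000]"),
    "stsb": load_dataset("sentence-transformers/stsb", split="train"),
}

# One loss per named dataset, mirroring the card
loss = {
    "all-nli-pair": losses.MultipleNegativesRankingLoss(model),
    "stsb": losses.CosineSimilarityLoss(model),
}

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()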
    

Evaluation Datasets

all-nli-triplet

  • Dataset: all-nli-triplet at d482672
  • Size: 6,584 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 6 tokens, mean 17.95 tokens, max 63 tokens
    • positive (string): min 4 tokens, mean 9.78 tokens, max 29 tokens
    • negative (string): min 5 tokens, mean 10.35 tokens, max 29 tokens
  • Samples:
    • anchor: Two women are embracing while holding to go packages. | positive: Two woman are holding packages. | negative: The men are fighting outside a deli.
    • anchor: Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink. | positive: Two kids in numbered jerseys wash their hands. | negative: Two kids in jackets walk to school.
    • anchor: A man selling donuts to a customer during a world exhibition event held in the city of Angeles | positive: A man selling donuts to a customer. | negative: A woman drinks her coffee in a small cafe.
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
    

stsb

  • Dataset: stsb at ab7a5ac
  • Size: 1,500 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 5 tokens, mean 15.1 tokens, max 45 tokens
    • sentence2 (string): min 6 tokens, mean 15.11 tokens, max 53 tokens
    • score (float): min 0.0, mean 0.47, max 1.0
  • Samples:
    • sentence1: A man with a hard hat is dancing. | sentence2: A man wearing a hard hat is dancing. | score: 1.0
    • sentence1: A young child is riding a horse. | sentence2: A child is riding a horse. | score: 0.95
    • sentence1: A man is feeding a mouse to a snake. | sentence2: The man is feeding a mouse to the snake. | score: 1.0
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
    

quora

  • Dataset: quora at 451a485
  • Size: 1,000 evaluation samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor (string): min 6 tokens, mean 14.05 tokens, max 70 tokens
    • positive (string): min 6 tokens, mean 14.11 tokens, max 49 tokens
  • Samples:
    • anchor: What is your New Year resolution? | positive: What can be my new year resolution for 2017?
    • anchor: Should I buy the IPhone 6s or Samsung Galaxy s7? | positive: Which is better: the iPhone 6S Plus or the Samsung Galaxy S7 Edge?
    • anchor: What are the differences between transgression and regression? | positive: What is the difference between transgression and regression?
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

natural-questions

  • Dataset: natural-questions at f9e894e
  • Size: 1,000 evaluation samples
  • Columns: query and answer
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 9 tokens, mean 11.8 tokens, max 21 tokens
    • answer (string): min 19 tokens, mean 138.84 tokens, max 512 tokens
  • Samples:
    query answer
    where does the waikato river begin and end Waikato River The Waikato River is the longest river in New Zealand, running for 425 kilometres (264 mi) through the North Island. It rises in the eastern slopes of Mount Ruapehu, joining the Tongariro River system and flowing through Lake Taupo, New Zealand's largest lake. It then drains Taupo at the lake's northeastern edge, creates the Huka Falls, and flows northwest through the Waikato Plains. It empties into the Tasman Sea south of Auckland, at Port Waikato. It gives its name to the Waikato Region that surrounds the Waikato Plains. The present course of the river was largely formed about 17,000 years ago. Contributing factors were climate warming, forest being reestablished in the river headwaters and the deepening, rather than widening, of the existing river channel. The channel was gradually eroded as far up river as Piarere, leaving the old Hinuera channel high and dry.[2] The remains of the old river path can be clearly seen at Hinuera where the cliffs mark the ancient river edges. The river's main tributary is the Waipa River, which has its confluence with the Waikato at Ngaruawahia.
    what type of gas is produced during fermentation Fermentation Fermentation reacts NADH with an endogenous, organic electron acceptor.[1] Usually this is pyruvate formed from sugar through glycolysis. The reaction produces NAD+ and an organic product, typical examples being ethanol, lactic acid, carbon dioxide, and hydrogen gas (H2). However, more exotic compounds can be produced by fermentation, such as butyric acid and acetone. Fermentation products contain chemical energy (they are not fully oxidized), but are considered waste products, since they cannot be metabolized further without the use of oxygen.
    why was star wars episode iv released first Star Wars (film) Star Wars (later retitled Star Wars: Episode IV – A New Hope) is a 1977 American epic space opera film written and directed by George Lucas. It is the first film in the original Star Wars trilogy and the beginning of the Star Wars franchise. Starring Mark Hamill, Harrison Ford, Carrie Fisher, Peter Cushing, Alec Guinness, David Prowse, James Earl Jones, Anthony Daniels, Kenny Baker, and Peter Mayhew, the film's plot focuses on the Rebel Alliance, led by Princess Leia (Fisher), and its attempt to destroy the Galactic Empire's space station, the Death Star. This conflict disrupts the isolated life of farmhand Luke Skywalker (Hamill), who inadvertently acquires two droids that possess stolen architectural plans for the Death Star. When the Empire begins a destructive search for the missing droids, Skywalker accompanies Jedi Master Obi-Wan Kenobi (Guinness) on a mission to return the plans to the Rebel Alliance and rescue Leia from her imprisonment by the Empire.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • learning_rate: 2e-05
  • weight_decay: 0.01

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
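
A hedged sketch of how the non-default values listed above translate into SentenceTransformerTrainingArguments; output_dir is illustrative, every other value mirrors the list:

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="bert-base-all-nli-stsb-quora-nq",  # illustrative path
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    weight_decay=0.01,
    eval_strategy="epoch",
    # multi_dataset_batch_sampler defaults to proportional sampling across the training datasets
)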

Training Logs

Epoch Step Training Loss quora loss all-nli-triplet loss natural-questions loss stsb loss
0.0024 10 1.1198 - - - -
0.0049 20 1.8886 - - - -
0.0073 30 0.2303 - - - -
0.0097 40 0.1287 - - - -
0.0122 50 0.4993 - - - -
0.0146 60 0.7388 - - - -
0.0170 70 0.8465 - - - -
0.0195 80 0.8701 - - - -
0.0219 90 0.4349 - - - -
0.0243 100 0.2214 - - - -
0.0268 110 0.1308 - - - -
0.0292 120 0.3163 - - - -
0.0316 130 0.3892 - - - -
0.0341 140 0.2641 - - - -
0.0365 150 0.3359 - - - -
0.0389 160 0.5498 - - - -
0.0414 170 0.2354 - - - -
0.0438 180 0.13 - - - -
0.0462 190 0.2307 - - - -
0.0487 200 0.1271 - - - -
0.0511 210 0.064 - - - -
0.0535 220 0.1842 - - - -
0.0560 230 0.1626 - - - -
0.0584 240 0.1869 - - - -
0.0608 250 0.2147 - - - -
0.0633 260 0.2534 - - - -
0.0657 270 0.1005 - - - -
0.0681 280 0.185 - - - -
0.0706 290 0.1867 - - - -
0.0730 300 0.1905 - - - -
0.0754 310 0.2056 - - - -
0.0779 320 0.2223 - - - -
0.0803 330 0.1499 - - - -
0.0827 340 0.107 - - - -
0.0852 350 0.1481 - - - -
0.0876 360 0.1723 - - - -
0.0900 370 0.2387 - - - -
0.0925 380 0.274 - - - -
0.0949 390 0.1058 - - - -
0.0973 400 0.2053 - - - -
0.0998 410 0.1103 - - - -
0.1022 420 0.1839 - - - -
0.1046 430 0.2341 - - - -
0.1071 440 0.2015 - - - -
0.1095 450 0.1356 - - - -
0.1119 460 0.0793 - - - -
0.1144 470 0.2756 - - - -
0.1168 480 0.0957 - - - -
0.1192 490 0.2549 - - - -
0.1217 500 0.1483 - - - -
0.1241 510 0.2444 - - - -
0.1265 520 0.1665 - - - -
0.1290 530 0.1091 - - - -
0.1314 540 0.1562 - - - -
0.1338 550 0.2385 - - - -
0.1363 560 0.2801 - - - -
0.1387 570 0.2929 - - - -
0.1411 580 0.2027 - - - -
0.1436 590 0.1628 - - - -
0.1460 600 0.1434 - - - -
0.1484 610 0.1009 - - - -
0.1509 620 0.2225 - - - -
0.1533 630 0.1103 - - - -
0.1557 640 0.1945 - - - -
0.1582 650 0.096 - - - -
0.1606 660 0.089 - - - -
0.1630 670 0.1493 - - - -
0.1655 680 0.1297 - - - -
0.1679 690 0.0811 - - - -
0.1703 700 0.1718 - - - -
0.1727 710 0.1139 - - - -
0.1752 720 0.2218 - - - -
0.1776 730 0.1397 - - - -
0.1800 740 0.1163 - - - -
0.1825 750 0.1232 - - - -
0.1849 760 0.1724 - - - -
0.1873 770 0.051 - - - -
0.1898 780 0.1442 - - - -
0.1922 790 0.3022 - - - -
0.1946 800 0.1056 - - - -
0.1971 810 0.1798 - - - -
0.1995 820 0.2234 - - - -
0.2019 830 0.1251 - - - -
0.2044 840 0.2053 - - - -
0.2068 850 0.1332 - - - -
0.2092 860 0.1611 - - - -
0.2117 870 0.0685 - - - -
0.2141 880 0.1434 - - - -
0.2165 890 0.1516 - - - -
0.2190 900 0.1158 - - - -
0.2214 910 0.1235 - - - -
0.2238 920 0.1113 - - - -
0.2263 930 0.2258 - - - -
0.2287 940 0.1003 - - - -
0.2311 950 0.1943 - - - -
0.2336 960 0.1338 - - - -
0.2360 970 0.1892 - - - -
0.2384 980 0.1784 - - - -
0.2409 990 0.1379 - - - -
0.2433 1000 0.1426 - - - -
0.2457 1010 0.1536 - - - -
0.2482 1020 0.118 - - - -
0.2506 1030 0.1463 - - - -
0.2530 1040 0.1821 - - - -
0.2555 1050 0.1829 - - - -
0.2579 1060 0.2086 - - - -
0.2603 1070 0.1066 - - - -
0.2628 1080 0.2072 - - - -
0.2652 1090 0.0754 - - - -
0.2676 1100 0.0863 - - - -
0.2701 1110 0.0821 - - - -
0.2725 1120 0.0978 - - - -
0.2749 1130 0.093 - - - -
0.2774 1140 0.0999 - - - -
0.2798 1150 0.1242 - - - -
0.2822 1160 0.1832 - - - -
0.2847 1170 0.1515 - - - -
0.2871 1180 0.187 - - - -
0.2895 1190 0.1394 - - - -
0.2920 1200 0.1922 - - - -
0.2944 1210 0.1522 - - - -
0.2968 1220 0.2439 - - - -
0.2993 1230 0.0743 - - - -
0.3017 1240 0.101 - - - -
0.3041 1250 0.0736 - - - -
0.3066 1260 0.1892 - - - -
0.3090 1270 0.1031 - - - -
0.3114 1280 0.1348 - - - -
0.3139 1290 0.0839 - - - -
0.3163 1300 0.104 - - - -
0.3187 1310 0.1508 - - - -
0.3212 1320 0.163 - - - -
0.3236 1330 0.1057 - - - -
0.3260 1340 0.0979 - - - -
0.3285 1350 0.1521 - - - -
0.3309 1360 0.0549 - - - -
0.3333 1370 0.1038 - - - -
0.3358 1380 0.1023 - - - -
0.3382 1390 0.0543 - - - -
0.3406 1400 0.1276 - - - -
0.3431 1410 0.0705 - - - -
0.3455 1420 0.1127 - - - -
0.3479 1430 0.0737 - - - -
0.3504 1440 0.066 - - - -
0.3528 1450 0.0864 - - - -
0.3552 1460 0.1299 - - - -
0.3577 1470 0.1171 - - - -
0.3601 1480 0.1578 - - - -
0.3625 1490 0.0774 - - - -
0.3650 1500 0.2007 - - - -
0.3674 1510 0.1538 - - - -
0.3698 1520 0.1343 - - - -
0.3723 1530 0.0861 - - - -
0.3747 1540 0.1305 - - - -
0.3771 1550 0.3199 - - - -
0.3796 1560 0.0887 - - - -
0.3820 1570 0.1275 - - - -
0.3844 1580 0.1526 - - - -
0.3869 1590 0.1412 - - - -
0.3893 1600 0.096 - - - -
0.3917 1610 0.1666 - - - -
0.3942 1620 0.1311 - - - -
0.3966 1630 0.0828 - - - -
0.3990 1640 0.0929 - - - -
0.4015 1650 0.1271 - - - -
0.4039 1660 0.0411 - - - -
0.4063 1670 0.0848 - - - -
0.4088 1680 0.2556 - - - -
0.4112 1690 0.1273 - - - -
0.4136 1700 0.1636 - - - -
0.4161 1710 0.0851 - - - -
0.4185 1720 0.1129 - - - -
0.4209 1730 0.1433 - - - -
0.4234 1740 0.1752 - - - -
0.4258 1750 0.1049 - - - -
0.4282 1760 0.1691 - - - -
0.4307 1770 0.2687 - - - -
0.4331 1780 0.1624 - - - -
0.4355 1790 0.1654 - - - -
0.4380 1800 0.1209 - - - -
0.4404 1810 0.2127 - - - -
0.4428 1820 0.0449 - - - -
0.4453 1830 0.0906 - - - -
0.4477 1840 0.1546 - - - -
0.4501 1850 0.0938 - - - -
0.4526 1860 0.1115 - - - -
0.4550 1870 0.0864 - - - -
0.4574 1880 0.1515 - - - -
0.4599 1890 0.091 - - - -
0.4623 1900 0.1496 - - - -
0.4647 1910 0.1807 - - - -
0.4672 1920 0.1351 - - - -
0.4696 1930 0.114 - - - -
0.4720 1940 0.1673 - - - -
0.4745 1950 0.1655 - - - -
0.4769 1960 0.0662 - - - -
0.4793 1970 0.1377 - - - -
0.4818 1980 0.0512 - - - -
0.4842 1990 0.1399 - - - -
0.4866 2000 0.1613 - - - -
0.4891 2010 0.1326 - - - -
0.4915 2020 0.1201 - - - -
0.4939 2030 0.097 - - - -
0.4964 2040 0.0788 - - - -
0.4988 2050 0.1282 - - - -
0.5012 2060 0.2038 - - - -
0.5036 2070 0.1078 - - - -
0.5061 2080 0.1594 - - - -
0.5085 2090 0.1628 - - - -
0.5109 2100 0.0744 - - - -
0.5134 2110 0.1587 - - - -
0.5158 2120 0.0573 - - - -
0.5182 2130 0.1672 - - - -
0.5207 2140 0.1139 - - - -
0.5231 2150 0.1285 - - - -
0.5255 2160 0.1538 - - - -
0.5280 2170 0.1642 - - - -
0.5304 2180 0.1012 - - - -
0.5328 2190 0.0554 - - - -
0.5353 2200 0.0656 - - - -
0.5377 2210 0.1206 - - - -
0.5401 2220 0.1164 - - - -
0.5426 2230 0.1364 - - - -
0.5450 2240 0.1188 - - - -
0.5474 2250 0.0965 - - - -
0.5499 2260 0.0789 - - - -
0.5523 2270 0.0793 - - - -
0.5547 2280 0.1205 - - - -
0.5572 2290 0.089 - - - -
0.5596 2300 0.1049 - - - -
0.5620 2310 0.0989 - - - -
0.5645 2320 0.1822 - - - -
0.5669 2330 0.1367 - - - -
0.5693 2340 0.1238 - - - -
0.5718 2350 0.1383 - - - -
0.5742 2360 0.184 - - - -
0.5766 2370 0.1254 - - - -
0.5791 2380 0.1046 - - - -
0.5815 2390 0.1175 - - - -
0.5839 2400 0.0698 - - - -
0.5864 2410 0.111 - - - -
0.5888 2420 0.115 - - - -
0.5912 2430 0.1721 - - - -
0.5937 2440 0.0904 - - - -
0.5961 2450 0.1142 - - - -
0.5985 2460 0.1021 - - - -
0.6010 2470 0.0307 - - - -
0.6034 2480 0.1495 - - - -
0.6058 2490 0.1031 - - - -
0.6083 2500 0.0951 - - - -
0.6107 2510 0.0941 - - - -
0.6131 2520 0.2231 - - - -
0.6156 2530 0.1572 - - - -
0.6180 2540 0.2004 - - - -
0.6204 2550 0.0573 - - - -
0.6229 2560 0.156 - - - -
0.6253 2570 0.1244 - - - -
0.6277 2580 0.0996 - - - -
0.6302 2590 0.163 - - - -
0.6326 2600 0.169 - - - -
0.6350 2610 0.1593 - - - -
0.6375 2620 0.098 - - - -
0.6399 2630 0.1133 - - - -
0.6423 2640 0.1267 - - - -
0.6448 2650 0.1006 - - - -
0.6472 2660 0.178 - - - -
0.6496 2670 0.1124 - - - -
0.6521 2680 0.0952 - - - -
0.6545 2690 0.0726 - - - -
0.6569 2700 0.1105 - - - -
0.6594 2710 0.1675 - - - -
0.6618 2720 0.1711 - - - -
0.6642 2730 0.1481 - - - -
0.6667 2740 0.1078 - - - -
0.6691 2750 0.0981 - - - -
0.6715 2760 0.115 - - - -
0.6740 2770 0.0855 - - - -
0.6764 2780 0.0657 - - - -
0.6788 2790 0.0539 - - - -
0.6813 2800 0.0766 - - - -
0.6837 2810 0.1608 - - - -
0.6861 2820 0.1263 - - - -
0.6886 2830 0.0992 - - - -
0.6910 2840 0.1147 - - - -
0.6934 2850 0.1697 - - - -
0.6959 2860 0.1602 - - - -
0.6983 2870 0.083 - - - -
0.7007 2880 0.1068 - - - -
0.7032 2890 0.1074 - - - -
0.7056 2900 0.0695 - - - -
0.7080 2910 0.0529 - - - -
0.7105 2920 0.1381 - - - -
0.7129 2930 0.1418 - - - -
0.7153 2940 0.1506 - - - -
0.7178 2950 0.1069 - - - -
0.7202 2960 0.147 - - - -
0.7226 2970 0.1358 - - - -
0.7251 2980 0.1592 - - - -
0.7275 2990 0.1387 - - - -
0.7299 3000 0.0886 - - - -
0.7324 3010 0.149 - - - -
0.7348 3020 0.1347 - - - -
0.7372 3030 0.1022 - - - -
0.7397 3040 0.0747 - - - -
0.7421 3050 0.0839 - - - -
0.7445 3060 0.1364 - - - -
0.7470 3070 0.1191 - - - -
0.7494 3080 0.0779 - - - -
0.7518 3090 0.0654 - - - -
0.7543 3100 0.0714 - - - -
0.7567 3110 0.1154 - - - -
0.7591 3120 0.0546 - - - -
0.7616 3130 0.0548 - - - -
0.7640 3140 0.0569 - - - -
0.7664 3150 0.0964 - - - -
0.7689 3160 0.0445 - - - -
0.7713 3170 0.1362 - - - -
0.7737 3180 0.1239 - - - -
0.7762 3190 0.0981 - - - -
0.7786 3200 0.0422 - - - -
0.7810 3210 0.1282 - - - -
0.7835 3220 0.0847 - - - -
0.7859 3230 0.1134 - - - -
0.7883 3240 0.1048 - - - -
0.7908 3250 0.1091 - - - -
0.7932 3260 0.0428 - - - -
0.7956 3270 0.0632 - - - -
0.7981 3280 0.0808 - - - -
0.8005 3290 0.0604 - - - -
0.8029 3300 0.1614 - - - -
0.8054 3310 0.1604 - - - -
0.8078 3320 0.0899 - - - -
0.8102 3330 0.1097 - - - -
0.8127 3340 0.1269 - - - -
0.8151 3350 0.0738 - - - -
0.8175 3360 0.0768 - - - -
0.8200 3370 0.0752 - - - -
0.8224 3380 0.1379 - - - -
0.8248 3390 0.0877 - - - -
0.8273 3400 0.1311 - - - -
0.8297 3410 0.1109 - - - -
0.8321 3420 0.1557 - - - -
0.8345 3430 0.1509 - - - -
0.8370 3440 0.0962 - - - -
0.8394 3450 0.0631 - - - -
0.8418 3460 0.0835 - - - -
0.8443 3470 0.1488 - - - -
0.8467 3480 0.0903 - - - -
0.8491 3490 0.0927 - - - -
0.8516 3500 0.1457 - - - -
0.8540 3510 0.0775 - - - -
0.8564 3520 0.1314 - - - -
0.8589 3530 0.1528 - - - -
0.8613 3540 0.0695 - - - -
0.8637 3550 0.0673 - - - -
0.8662 3560 0.1441 - - - -
0.8686 3570 0.135 - - - -
0.8710 3580 0.1595 - - - -
0.8735 3590 0.1125 - - - -
0.8759 3600 0.0709 - - - -
0.8783 3610 0.1191 - - - -
0.8808 3620 0.1614 - - - -
0.8832 3630 0.086 - - - -
0.8856 3640 0.0818 - - - -
0.8881 3650 0.0544 - - - -
0.8905 3660 0.0797 - - - -
0.8929 3670 0.0691 - - - -
0.8954 3680 0.0924 - - - -
0.8978 3690 0.0572 - - - -
0.9002 3700 0.0532 - - - -
0.9027 3710 0.1519 - - - -
0.9051 3720 0.0983 - - - -
0.9075 3730 0.0772 - - - -
0.9100 3740 0.18 - - - -
0.9124 3750 0.0485 - - - -
0.9148 3760 0.0872 - - - -
0.9173 3770 0.1069 - - - -
0.9197 3780 0.0657 - - - -
0.9221 3790 0.1811 - - - -
0.9246 3800 0.1038 - - - -
0.9270 3810 0.087 - - - -
0.9294 3820 0.1569 - - - -
0.9319 3830 0.0404 - - - -
0.9343 3840 0.1468 - - - -
0.9367 3850 0.0974 - - - -
0.9392 3860 0.1231 - - - -
0.9416 3870 0.1511 - - - -
0.9440 3880 0.0386 - - - -
0.9465 3890 0.0918 - - - -
0.9489 3900 0.0661 - - - -
0.9513 3910 0.1355 - - - -
0.9538 3920 0.1182 - - - -
0.9562 3930 0.1254 - - - -
0.9586 3940 0.1999 - - - -
0.9611 3950 0.125 - - - -
0.9635 3960 0.0303 - - - -
0.9659 3970 0.1192 - - - -
0.9684 3980 0.1182 - - - -
0.9708 3990 0.1449 - - - -
0.9732 4000 0.1387 - - - -
0.9757 4010 0.077 - - - -
0.9781 4020 0.1118 - - - -
0.9805 4030 0.0567 - - - -
0.9830 4040 0.0454 - - - -
0.9854 4050 0.1179 - - - -
0.9878 4060 0.0993 - - - -
0.9903 4070 0.1377 - - - -
0.9927 4080 0.1308 - - - -
0.9951 4090 0.0982 - - - -
0.9976 4100 0.1211 - - - -
1.0 4110 0.2036 0.0136 2.3842 0.0331 0.0606
1.0024 4120 0.1825 - - - -
1.0049 4130 0.1088 - - - -
1.0073 4140 0.1301 - - - -
1.0097 4150 0.0549 - - - -
1.0122 4160 0.0714 - - - -
1.0146 4170 0.0743 - - - -
1.0170 4180 0.0531 - - - -
1.0195 4190 0.0749 - - - -
1.0219 4200 0.0868 - - - -
1.0243 4210 0.0544 - - - -
1.0268 4220 0.0894 - - - -
1.0292 4230 0.0971 - - - -
1.0316 4240 0.0709 - - - -
1.0341 4250 0.055 - - - -
1.0365 4260 0.0386 - - - -
1.0389 4270 0.1549 - - - -
1.0414 4280 0.102 - - - -
1.0438 4290 0.0422 - - - -
1.0462 4300 0.0886 - - - -
1.0487 4310 0.0583 - - - -
1.0511 4320 0.0522 - - - -
1.0535 4330 0.0478 - - - -
1.0560 4340 0.0328 - - - -
1.0584 4350 0.028 - - - -
1.0608 4360 0.0129 - - - -
1.0633 4370 0.084 - - - -
1.0657 4380 0.0523 - - - -
1.0681 4390 0.1178 - - - -
1.0706 4400 0.0294 - - - -
1.0730 4410 0.0648 - - - -
1.0754 4420 0.0422 - - - -
1.0779 4430 0.0922 - - - -
1.0803 4440 0.0587 - - - -
1.0827 4450 0.0554 - - - -
1.0852 4460 0.0951 - - - -
1.0876 4470 0.108 - - - -
1.0900 4480 0.0677 - - - -
1.0925 4490 0.0737 - - - -
1.0949 4500 0.0447 - - - -
1.0973 4510 0.0531 - - - -
1.0998 4520 0.0605 - - - -
1.1022 4530 0.0871 - - - -
1.1046 4540 0.0718 - - - -
1.1071 4550 0.0672 - - - -
1.1095 4560 0.0829 - - - -
1.1119 4570 0.0539 - - - -
1.1144 4580 0.0751 - - - -
1.1168 4590 0.0521 - - - -
1.1192 4600 0.1046 - - - -
1.1217 4610 0.0631 - - - -
1.1241 4620 0.1142 - - - -
1.1265 4630 0.0556 - - - -
1.1290 4640 0.0398 - - - -
1.1314 4650 0.0817 - - - -
1.1338 4660 0.054 - - - -
1.1363 4670 0.12 - - - -
1.1387 4680 0.0762 - - - -
1.1411 4690 0.0138 - - - -
1.1436 4700 0.0777 - - - -
1.1460 4710 0.0582 - - - -
1.1484 4720 0.0721 - - - -
1.1509 4730 0.104 - - - -
1.1533 4740 0.087 - - - -
1.1557 4750 0.0842 - - - -
1.1582 4760 0.0416 - - - -
1.1606 4770 0.0806 - - - -
1.1630 4780 0.0588 - - - -
1.1655 4790 0.0291 - - - -
1.1679 4800 0.0638 - - - -
1.1703 4810 0.0837 - - - -
1.1727 4820 0.0702 - - - -
1.1752 4830 0.0442 - - - -
1.1776 4840 0.0528 - - - -
1.1800 4850 0.0601 - - - -
1.1825 4860 0.0344 - - - -
1.1849 4870 0.0443 - - - -
1.1873 4880 0.0383 - - - -
1.1898 4890 0.0359 - - - -
1.1922 4900 0.137 - - - -
1.1946 4910 0.0451 - - - -
1.1971 4920 0.0635 - - - -
1.1995 4930 0.0927 - - - -
1.2019 4940 0.0734 - - - -
1.2044 4950 0.0839 - - - -
1.2068 4960 0.1103 - - - -
1.2092 4970 0.0715 - - - -
1.2117 4980 0.0229 - - - -
1.2141 4990 0.0237 - - - -
1.2165 5000 0.0618 - - - -
1.2190 5010 0.0559 - - - -
1.2214 5020 0.0967 - - - -
1.2238 5030 0.0697 - - - -
1.2263 5040 0.0507 - - - -
1.2287 5050 0.0642 - - - -
1.2311 5060 0.0485 - - - -
1.2336 5070 0.0676 - - - -
1.2360 5080 0.1147 - - - -
1.2384 5090 0.061 - - - -
1.2409 5100 0.0333 - - - -
1.2433 5110 0.0334 - - - -
1.2457 5120 0.0751 - - - -
1.2482 5130 0.0942 - - - -
1.2506 5140 0.0609 - - - -
1.2530 5150 0.0983 - - - -
1.2555 5160 0.033 - - - -
1.2579 5170 0.0805 - - - -
1.2603 5180 0.0561 - - - -
1.2628 5190 0.0961 - - - -
1.2652 5200 0.0579 - - - -
1.2676 5210 0.0648 - - - -
1.2701 5220 0.0507 - - - -
1.2725 5230 0.0313 - - - -
1.2749 5240 0.0429 - - - -
1.2774 5250 0.0673 - - - -
1.2798 5260 0.0926 - - - -
1.2822 5270 0.0745 - - - -
1.2847 5280 0.0566 - - - -
1.2871 5290 0.0657 - - - -
1.2895 5300 0.0755 - - - -
1.2920 5310 0.0607 - - - -
1.2944 5320 0.0849 - - - -
1.2968 5330 0.106 - - - -
1.2993 5340 0.0283 - - - -
1.3017 5350 0.0628 - - - -
1.3041 5360 0.0603 - - - -
1.3066 5370 0.0616 - - - -
1.3090 5380 0.0463 - - - -
1.3114 5390 0.0546 - - - -
1.3139 5400 0.0492 - - - -
1.3163 5410 0.0555 - - - -
1.3187 5420 0.0817 - - - -
1.3212 5430 0.0876 - - - -
1.3236 5440 0.0379 - - - -
1.3260 5450 0.0788 - - - -
1.3285 5460 0.0751 - - - -
1.3309 5470 0.0366 - - - -
1.3333 5480 0.073 - - - -
1.3358 5490 0.0562 - - - -
1.3382 5500 0.0129 - - - -
1.3406 5510 0.0575 - - - -
1.3431 5520 0.0644 - - - -
1.3455 5530 0.0419 - - - -
1.3479 5540 0.0578 - - - -
1.3504 5550 0.0402 - - - -
1.3528 5560 0.0455 - - - -
1.3552 5570 0.0676 - - - -
1.3577 5580 0.0503 - - - -
1.3601 5590 0.0824 - - - -
1.3625 5600 0.0288 - - - -
1.3650 5610 0.1038 - - - -
1.3674 5620 0.0681 - - - -
1.3698 5630 0.0767 - - - -
1.3723 5640 0.0507 - - - -
1.3747 5650 0.0532 - - - -
1.3771 5660 0.1468 - - - -
1.3796 5670 0.0391 - - - -
1.3820 5680 0.0566 - - - -
1.3844 5690 0.0496 - - - -
1.3869 5700 0.0688 - - - -
1.3893 5710 0.062 - - - -
1.3917 5720 0.0834 - - - -
1.3942 5730 0.0611 - - - -
1.3966 5740 0.0593 - - - -
1.3990 5750 0.0664 - - - -
1.4015 5760 0.0841 - - - -
1.4039 5770 0.02 - - - -
1.4063 5780 0.0283 - - - -
1.4088 5790 0.1089 - - - -
1.4112 5800 0.0583 - - - -
1.4136 5810 0.0692 - - - -
1.4161 5820 0.0371 - - - -
1.4185 5830 0.0575 - - - -
1.4209 5840 0.0822 - - - -
1.4234 5850 0.1046 - - - -
1.4258 5860 0.0509 - - - -
1.4282 5870 0.0943 - - - -
1.4307 5880 0.1221 - - - -
1.4331 5890 0.0651 - - - -
1.4355 5900 0.0701 - - - -
1.4380 5910 0.0638 - - - -
1.4404 5920 0.1021 - - - -
1.4428 5930 0.0386 - - - -
1.4453 5940 0.0697 - - - -
1.4477 5950 0.064 - - - -
1.4501 5960 0.0522 - - - -
1.4526 5970 0.075 - - - -
1.4550 5980 0.0383 - - - -
1.4574 5990 0.0818 - - - -
1.4599 6000 0.0472 - - - -
1.4623 6010 0.0783 - - - -
1.4647 6020 0.0517 - - - -
1.4672 6030 0.046 - - - -
1.4696 6040 0.0759 - - - -
1.4720 6050 0.0645 - - - -
1.4745 6060 0.0794 - - - -
1.4769 6070 0.0396 - - - -
1.4793 6080 0.0524 - - - -
1.4818 6090 0.0116 - - - -
1.4842 6100 0.0657 - - - -
1.4866 6110 0.0728 - - - -
1.4891 6120 0.0663 - - - -
1.4915 6130 0.0965 - - - -
1.4939 6140 0.0535 - - - -
1.4964 6150 0.0389 - - - -
1.4988 6160 0.0976 - - - -
1.5012 6170 0.1219 - - - -
1.5036 6180 0.0488 - - - -
1.5061 6190 0.1015 - - - -
1.5085 6200 0.0982 - - - -
1.5109 6210 0.0565 - - - -
1.5134 6220 0.0831 - - - -
1.5158 6230 0.0463 - - - -
1.5182 6240 0.1356 - - - -
1.5207 6250 0.0567 - - - -
1.5231 6260 0.0459 - - - -
1.5255 6270 0.0767 - - - -
1.5280 6280 0.0798 - - - -
1.5304 6290 0.0632 - - - -
1.5328 6300 0.0431 - - - -
1.5353 6310 0.0175 - - - -
1.5377 6320 0.0482 - - - -
1.5401 6330 0.0841 - - - -
1.5426 6340 0.0756 - - - -
1.5450 6350 0.078 - - - -
1.5474 6360 0.0608 - - - -
1.5499 6370 0.0678 - - - -
1.5523 6380 0.054 - - - -
1.5547 6390 0.0823 - - - -
1.5572 6400 0.0322 - - - -
1.5596 6410 0.0432 - - - -
1.5620 6420 0.0251 - - - -
1.5645 6430 0.0349 - - - -
1.5669 6440 0.0591 - - - -
1.5693 6450 0.095 - - - -
1.5718 6460 0.0654 - - - -
1.5742 6470 0.1019 - - - -
1.5766 6480 0.0418 - - - -
1.5791 6490 0.038 - - - -
1.5815 6500 0.0884 - - - -
1.5839 6510 0.0439 - - - -
1.5864 6520 0.0704 - - - -
1.5888 6530 0.0664 - - - -
1.5912 6540 0.0776 - - - -
1.5937 6550 0.0295 - - - -
1.5961 6560 0.0735 - - - -
1.5985 6570 0.0668 - - - -
1.6010 6580 0.0202 - - - -
1.6034 6590 0.0638 - - - -
1.6058 6600 0.0705 - - - -
1.6083 6610 0.0558 - - - -
1.6107 6620 0.0474 - - - -
1.6131 6630 0.1205 - - - -
1.6156 6640 0.0995 - - - -
1.6180 6650 0.0837 - - - -
1.6204 6660 0.0146 - - - -
1.6229 6670 0.0445 - - - -
1.6253 6680 0.0797 - - - -
1.6277 6690 0.0484 - - - -
1.6302 6700 0.0699 - - - -
1.6326 6710 0.0832 - - - -
1.6350 6720 0.0718 - - - -
1.6375 6730 0.0552 - - - -
1.6399 6740 0.0694 - - - -
1.6423 6750 0.0937 - - - -
1.6448 6760 0.068 - - - -
1.6472 6770 0.081 - - - -
1.6496 6780 0.069 - - - -
1.6521 6790 0.0253 - - - -
1.6545 6800 0.0411 - - - -
1.6569 6810 0.0496 - - - -
1.6594 6820 0.0868 - - - -
1.6618 6830 0.1038 - - - -
1.6642 6840 0.0789 - - - -
1.6667 6850 0.0385 - - - -
1.6691 6860 0.0467 - - - -
1.6715 6870 0.0699 - - - -
1.6740 6880 0.0553 - - - -
1.6764 6890 0.0439 - - - -
1.6788 6900 0.0426 - - - -
1.6813 6910 0.0337 - - - -
1.6837 6920 0.0668 - - - -
1.6861 6930 0.1154 - - - -
1.6886 6940 0.0544 - - - -
1.6910 6950 0.076 - - - -
1.6934 6960 0.0725 - - - -
1.6959 6970 0.1054 - - - -
1.6983 6980 0.0595 - - - -
1.7007 6990 0.0569 - - - -
1.7032 7000 0.075 - - - -
1.7056 7010 0.0664 - - - -
1.7080 7020 0.0363 - - - -
1.7105 7030 0.0685 - - - -
1.7129 7040 0.1046 - - - -
1.7153 7050 0.1213 - - - -
1.7178 7060 0.0692 - - - -
1.7202 7070 0.0937 - - - -
1.7226 7080 0.0795 - - - -
1.7251 7090 0.1151 - - - -
1.7275 7100 0.0604 - - - -
1.7299 7110 0.0719 - - - -
1.7324 7120 0.0456 - - - -
1.7348 7130 0.0431 - - - -
1.7372 7140 0.0706 - - - -
1.7397 7150 0.0568 - - - -
1.7421 7160 0.0664 - - - -
1.7445 7170 0.0706 - - - -
1.7470 7180 0.0558 - - - -
1.7494 7190 0.0526 - - - -
1.7518 7200 0.0426 - - - -
1.7543 7210 0.0602 - - - -
1.7567 7220 0.0664 - - - -
1.7591 7230 0.0236 - - - -
1.7616 7240 0.0321 - - - -
1.7640 7250 0.0192 - - - -
1.7664 7260 0.0523 - - - -
1.7689 7270 0.0377 - - - -
1.7713 7280 0.0878 - - - -
1.7737 7290 0.0751 - - - -
1.7762 7300 0.0664 - - - -
1.7786 7310 0.0178 - - - -
1.7810 7320 0.0668 - - - -
1.7835 7330 0.0341 - - - -
1.7859 7340 0.0747 - - - -
1.7883 7350 0.0541 - - - -
1.7908 7360 0.067 - - - -
1.7932 7370 0.0315 - - - -
1.7956 7380 0.0576 - - - -
1.7981 7390 0.0542 - - - -
1.8005 7400 0.0496 - - - -
1.8029 7410 0.0919 - - - -
1.8054 7420 0.0877 - - - -
1.8078 7430 0.047 - - - -
1.8102 7440 0.0859 - - - -
1.8127 7450 0.0671 - - - -
1.8151 7460 0.0484 - - - -
1.8175 7470 0.0698 - - - -
1.8200 7480 0.0536 - - - -
1.8224 7490 0.0583 - - - -
1.8248 7500 0.0768 - - - -
1.8273 7510 0.0643 - - - -
1.8297 7520 0.0699 - - - -
1.8321 7530 0.0855 - - - -
1.8345 7540 0.1032 - - - -
1.8370 7550 0.0707 - - - -
1.8394 7560 0.0352 - - - -
1.8418 7570 0.0503 - - - -
1.8443 7580 0.0736 - - - -
1.8467 7590 0.0543 - - - -
1.8491 7600 0.0808 - - - -
1.8516 7610 0.0945 - - - -
1.8540 7620 0.0433 - - - -
1.8564 7630 0.0907 - - - -
1.8589 7640 0.0914 - - - -
1.8613 7650 0.0424 - - - -
1.8637 7660 0.0614 - - - -
1.8662 7670 0.1035 - - - -
1.8686 7680 0.0734 - - - -
1.8710 7690 0.0926 - - - -
1.8735 7700 0.0756 - - - -
1.8759 7710 0.0406 - - - -
1.8783 7720 0.0985 - - - -
1.8808 7730 0.0984 - - - -
1.8832 7740 0.0425 - - - -
1.8856 7750 0.0519 - - - -
1.8881 7760 0.0508 - - - -
1.8905 7770 0.0372 - - - -
1.8929 7780 0.0582 - - - -
1.8954 7790 0.0589 - - - -
1.8978 7800 0.0356 - - - -
1.9002 7810 0.0334 - - - -
1.9027 7820 0.052 - - - -
1.9051 7830 0.0696 - - - -
1.9075 7840 0.0684 - - - -
1.9100 7850 0.1165 - - - -
1.9124 7860 0.0419 - - - -
1.9148 7870 0.0706 - - - -
1.9173 7880 0.0609 - - - -
1.9197 7890 0.0283 - - - -
1.9221 7900 0.0722 - - - -
1.9246 7910 0.0866 - - - -
1.9270 7920 0.0671 - - - -
1.9294 7930 0.0753 - - - -
1.9319 7940 0.0265 - - - -
1.9343 7950 0.0934 - - - -
1.9367 7960 0.0661 - - - -
1.9392 7970 0.0798 - - - -
1.9416 7980 0.0953 - - - -
1.9440 7990 0.0078 - - - -
1.9465 8000 0.0631 - - - -
1.9489 8010 0.059 - - - -
1.9513 8020 0.0951 - - - -
1.9538 8030 0.0726 - - - -
1.9562 8040 0.0837 - - - -
1.9586 8050 0.113 - - - -
1.9611 8060 0.0732 - - - -
1.9635 8070 0.0227 - - - -
1.9659 8080 0.0766 - - - -
1.9684 8090 0.0684 - - - -
1.9708 8100 0.0923 - - - -
1.9732 8110 0.0949 - - - -
1.9757 8120 0.06 - - - -
1.9781 8130 0.0832 - - - -
1.9805 8140 0.0387 - - - -
1.9830 8150 0.0307 - - - -
1.9854 8160 0.0728 - - - -
1.9878 8170 0.0708 - - - -
1.9903 8180 0.1074 - - - -
1.9927 8190 0.0625 - - - -
1.9951 8200 0.0645 - - - -
1.9976 8210 0.0818 - - - -
2.0 8220 0.114 0.0109 2.2181 0.0328 0.0483
2.0024 8230 0.1097 - - - -
2.0049 8240 0.0758 - - - -
2.0073 8250 0.0848 - - - -
2.0097 8260 0.0365 - - - -
2.0122 8270 0.0404 - - - -
2.0146 8280 0.0462 - - - -
2.0170 8290 0.022 - - - -
2.0195 8300 0.0633 - - - -
2.0219 8310 0.0335 - - - -
2.0243 8320 0.0332 - - - -
2.0268 8330 0.0807 - - - -
2.0292 8340 0.0643 - - - -
2.0316 8350 0.0233 - - - -
2.0341 8360 0.0089 - - - -
2.0365 8370 0.0153 - - - -
2.0389 8380 0.0939 - - - -
2.0414 8390 0.0779 - - - -
2.0438 8400 0.0342 - - - -
2.0462 8410 0.0741 - - - -
2.0487 8420 0.0602 - - - -
2.0511 8430 0.0463 - - - -
2.0535 8440 0.0382 - - - -
2.0560 8450 0.0323 - - - -
2.0584 8460 0.0266 - - - -
2.0608 8470 0.0018 - - - -
2.0633 8480 0.0381 - - - -
2.0657 8490 0.0456 - - - -
2.0681 8500 0.0965 - - - -
2.0706 8510 0.0264 - - - -
2.0730 8520 0.0504 - - - -
2.0754 8530 0.0251 - - - -
2.0779 8540 0.0743 - - - -
2.0803 8550 0.0544 - - - -
2.0827 8560 0.0296 - - - -
2.0852 8570 0.0788 - - - -
2.0876 8580 0.0695 - - - -
2.0900 8590 0.049 - - - -
2.0925 8600 0.0468 - - - -
2.0949 8610 0.0398 - - - -
2.0973 8620 0.0371 - - - -
2.0998 8630 0.0512 - - - -
2.1022 8640 0.0699 - - - -
2.1046 8650 0.0531 - - - -
2.1071 8660 0.0601 - - - -
2.1095 8670 0.0666 - - - -
2.1119 8680 0.0499 - - - -
2.1144 8690 0.0437 - - - -
2.1168 8700 0.0445 - - - -
2.1192 8710 0.0548 - - - -
2.1217 8720 0.047 - - - -
2.1241 8730 0.0683 - - - -
2.1265 8740 0.0344 - - - -
2.1290 8750 0.0305 - - - -
2.1314 8760 0.048 - - - -
2.1338 8770 0.0402 - - - -
2.1363 8780 0.0727 - - - -
2.1387 8790 0.0235 - - - -
2.1411 8800 0.0088 - - - -
2.1436 8810 0.0602 - - - -
2.1460 8820 0.028 - - - -
2.1484 8830 0.0699 - - - -
2.1509 8840 0.0866 - - - -
2.1533 8850 0.0819 - - - -
2.1557 8860 0.0501 - - - -
2.1582 8870 0.0329 - - - -
2.1606 8880 0.0735 - - - -
2.1630 8890 0.0531 - - - -
2.1655 8900 0.0223 - - - -
2.1679 8910 0.0546 - - - -
2.1703 8920 0.0451 - - - -
2.1727 8930 0.047 - - - -
2.1752 8940 0.0244 - - - -
2.1776 8950 0.0378 - - - -
2.1800 8960 0.0182 - - - -
2.1825 8970 0.0224 - - - -
2.1849 8980 0.0327 - - - -
2.1873 8990 0.0323 - - - -
2.1898 9000 0.0307 - - - -
2.1922 9010 0.0874 - - - -
2.1946 9020 0.0407 - - - -
2.1971 9030 0.0502 - - - -
2.1995 9040 0.0474 - - - -
2.2019 9050 0.0437 - - - -
2.2044 9060 0.058 - - - -
2.2068 9070 0.0851 - - - -
2.2092 9080 0.0584 - - - -
2.2117 9090 0.0124 - - - -
2.2141 9100 0.0085 - - - -
2.2165 9110 0.0607 - - - -
2.2190 9120 0.0685 - - - -
2.2214 9130 0.0807 - - - -
2.2238 9140 0.0608 - - - -
2.2263 9150 0.0131 - - - -
2.2287 9160 0.0451 - - - -
2.2311 9170 0.0368 - - - -
2.2336 9180 0.0527 - - - -
2.2360 9190 0.0846 - - - -
2.2384 9200 0.0328 - - - -
2.2409 9210 0.0178 - - - -
2.2433 9220 0.0274 - - - -
2.2457 9230 0.0567 - - - -
2.2482 9240 0.0756 - - - -
2.2506 9250 0.0369 - - - -
2.2530 9260 0.0827 - - - -
2.2555 9270 0.023 - - - -
2.2579 9280 0.0749 - - - -
2.2603 9290 0.048 - - - -
2.2628 9300 0.0855 - - - -
2.2652 9310 0.0421 - - - -
2.2676 9320 0.0437 - - - -
2.2701 9330 0.0503 - - - -
2.2725 9340 0.0186 - - - -
2.2749 9350 0.0321 - - - -
2.2774 9360 0.0756 - - - -
2.2798 9370 0.0692 - - - -
2.2822 9380 0.0629 - - - -
2.2847 9390 0.0526 - - - -
2.2871 9400 0.0486 - - - -
2.2895 9410 0.0419 - - - -
2.2920 9420 0.0121 - - - -
2.2944 9430 0.0678 - - - -
2.2968 9440 0.0896 - - - -
2.2993 9450 0.0306 - - - -
2.3017 9460 0.0541 - - - -
2.3041 9470 0.0504 - - - -
2.3066 9480 0.0414 - - - -
2.3090 9490 0.0302 - - - -
2.3114 9500 0.0434 - - - -
2.3139 9510 0.0449 - - - -
2.3163 9520 0.0359 - - - -
2.3187 9530 0.0547 - - - -
2.3212 9540 0.0824 - - - -
2.3236 9550 0.0311 - - - -
2.3260 9560 0.0722 - - - -
2.3285 9570 0.0558 - - - -
2.3309 9580 0.0304 - - - -
2.3333 9590 0.0678 - - - -
2.3358 9600 0.0466 - - - -
2.3382 9610 0.0063 - - - -
2.3406 9620 0.04 - - - -
2.3431 9630 0.0579 - - - -
2.3455 9640 0.0286 - - - -
2.3479 9650 0.0473 - - - -
2.3504 9660 0.0395 - - - -
2.3528 9670 0.0344 - - - -
2.3552 9680 0.0399 - - - -
2.3577 9690 0.0391 - - - -
2.3601 9700 0.0393 - - - -
2.3625 9710 0.0185 - - - -
2.3650 9720 0.071 - - - -
2.3674 9730 0.0431 - - - -
2.3698 9740 0.0525 - - - -
2.3723 9750 0.0459 - - - -
2.3747 9760 0.0391 - - - -
2.3771 9770 0.1035 - - - -
2.3796 9780 0.0356 - - - -
2.3820 9790 0.0418 - - - -
2.3844 9800 0.0316 - - - -
2.3869 9810 0.053 - - - -
2.3893 9820 0.0489 - - - -
2.3917 9830 0.0603 - - - -
2.3942 9840 0.0422 - - - -
2.3966 9850 0.0491 - - - -
2.3990 9860 0.0441 - - - -
2.4015 9870 0.0773 - - - -
2.4039 9880 0.0172 - - - -
2.4063 9890 0.0274 - - - -
2.4088 9900 0.0776 - - - -
2.4112 9910 0.0446 - - - -
2.4136 9920 0.0502 - - - -
2.4161 9930 0.0321 - - - -
2.4185 9940 0.0342 - - - -
2.4209 9950 0.072 - - - -
2.4234 9960 0.0759 - - - -
2.4258 9970 0.04 - - - -
2.4282 9980 0.0703 - - - -
2.4307 9990 0.0674 - - - -
2.4331 10000 0.046 - - - -
2.4355 10010 0.0412 - - - -
2.4380 10020 0.0518 - - - -
2.4404 10030 0.0678 - - - -
2.4428 10040 0.0352 - - - -
2.4453 10050 0.0597 - - - -
2.4477 10060 0.0271 - - - -
2.4501 10070 0.0425 - - - -
2.4526 10080 0.0697 - - - -
2.4550 10090 0.0134 - - - -
2.4574 10100 0.0704 - - - -
2.4599 10110 0.027 - - - -
2.4623 10120 0.0523 - - - -
2.4647 10130 0.0373 - - - -
2.4672 10140 0.0276 - - - -
2.4696 10150 0.0715 - - - -
2.4720 10160 0.0538 - - - -
2.4745 10170 0.0598 - - - -
2.4769 10180 0.0349 - - - -
2.4793 10190 0.047 - - - -
2.4818 10200 0.0048 - - - -
2.4842 10210 0.0542 - - - -
2.4866 10220 0.0547 - - - -
2.4891 10230 0.0622 - - - -
2.4915 10240 0.0784 - - - -
2.4939 10250 0.0428 - - - -
2.4964 10260 0.0284 - - - -
2.4988 10270 0.0744 - - - -
2.5012 10280 0.0763 - - - -
2.5036 10290 0.0495 - - - -
2.5061 10300 0.0802 - - - -
2.5085 10310 0.077 - - - -
2.5109 10320 0.0376 - - - -
2.5134 10330 0.058 - - - -
2.5158 10340 0.044 - - - -
2.5182 10350 0.1121 - - - -
2.5207 10360 0.0354 - - - -
2.5231 10370 0.0267 - - - -
2.5255 10380 0.0445 - - - -
2.5280 10390 0.0536 - - - -
2.5304 10400 0.0539 - - - -
2.5328 10410 0.0353 - - - -
2.5353 10420 0.0147 - - - -
2.5377 10430 0.0319 - - - -
2.5401 10440 0.0676 - - - -
2.5426 10450 0.0395 - - - -
2.5450 10460 0.0648 - - - -
2.5474 10470 0.055 - - - -
2.5499 10480 0.0625 - - - -
2.5523 10490 0.04 - - - -
2.5547 10500 0.0678 - - - -
2.5572 10510 0.0251 - - - -
2.5596 10520 0.036 - - - -
2.5620 10530 0.0352 - - - -
2.5645 10540 0.0212 - - - -
2.5669 10550 0.0459 - - - -
2.5693 10560 0.0678 - - - -
2.5718 10570 0.053 - - - -
2.5742 10580 0.0888 - - - -
2.5766 10590 0.0374 - - - -
2.5791 10600 0.017 - - - -
2.5815 10610 0.0828 - - - -
2.5839 10620 0.0393 - - - -
2.5864 10630 0.0517 - - - -
2.5888 10640 0.0572 - - - -
2.5912 10650 0.0577 - - - -
2.5937 10660 0.0245 - - - -
2.5961 10670 0.0632 - - - -
2.5985 10680 0.0612 - - - -
2.6010 10690 0.0204 - - - -
2.6034 10700 0.0493 - - - -
2.6058 10710 0.0613 - - - -
2.6083 10720 0.0467 - - - -
2.6107 10730 0.0532 - - - -
2.6131 10740 0.0962 - - - -
2.6156 10750 0.048 - - - -
2.6180 10760 0.0623 - - - -
2.6204 10770 0.0049 - - - -
2.6229 10780 0.0359 - - - -
2.6253 10790 0.0536 - - - -
2.6277 10800 0.0423 - - - -
2.6302 10810 0.0306 - - - -
2.6326 10820 0.0412 - - - -
2.6350 10830 0.0559 - - - -
2.6375 10840 0.0574 - - - -
2.6399 10850 0.0521 - - - -
2.6423 10860 0.0638 - - - -
2.6448 10870 0.0476 - - - -
2.6472 10880 0.0715 - - - -
2.6496 10890 0.0453 - - - -
2.6521 10900 0.0115 - - - -
2.6545 10910 0.0339 - - - -
2.6569 10920 0.0436 - - - -
2.6594 10930 0.0613 - - - -
2.6618 10940 0.0697 - - - -
2.6642 10950 0.0666 - - - -
2.6667 10960 0.0183 - - - -
2.6691 10970 0.0405 - - - -
2.6715 10980 0.0607 - - - -
2.6740 10990 0.0327 - - - -
2.6764 11000 0.0367 - - - -
2.6788 11010 0.041 - - - -
2.6813 11020 0.0351 - - - -
2.6837 11030 0.0462 - - - -
2.6861 11040 0.1159 - - - -
2.6886 11050 0.0369 - - - -
2.6910 11060 0.0643 - - - -
2.6934 11070 0.0564 - - - -
2.6959 11080 0.0576 - - - -
2.6983 11090 0.061 - - - -
2.7007 11100 0.0513 - - - -
2.7032 11110 0.0674 - - - -
2.7056 11120 0.0658 - - - -
2.7080 11130 0.0182 - - - -
2.7105 11140 0.0585 - - - -
2.7129 11150 0.0825 - - - -
2.7153 11160 0.1078 - - - -
2.7178 11170 0.064 - - - -
2.7202 11180 0.0745 - - - -
2.7226 11190 0.0726 - - - -
2.7251 11200 0.0929 - - - -
2.7275 11210 0.0519 - - - -
2.7299 11220 0.0668 - - - -
2.7324 11230 0.0279 - - - -
2.7348 11240 0.0315 - - - -
2.7372 11250 0.0482 - - - -
2.7397 11260 0.0495 - - - -
2.7421 11270 0.0664 - - - -
2.7445 11280 0.0684 - - - -
2.7470 11290 0.0362 - - - -
2.7494 11300 0.0451 - - - -
2.7518 11310 0.0435 - - - -
2.7543 11320 0.0503 - - - -
2.7567 11330 0.053 - - - -
2.7591 11340 0.0198 - - - -
2.7616 11350 0.0289 - - - -
2.7640 11360 0.0137 - - - -
2.7664 11370 0.0468 - - - -
2.7689 11380 0.0349 - - - -
2.7713 11390 0.081 - - - -
2.7737 11400 0.0557 - - - -
2.7762 11410 0.0622 - - - -
2.7786 11420 0.0059 - - - -
2.7810 11430 0.0582 - - - -
2.7835 11440 0.022 - - - -
2.7859 11450 0.0539 - - - -
2.7883 11460 0.0329 - - - -
2.7908 11470 0.0616 - - - -
2.7932 11480 0.031 - - - -
2.7956 11490 0.0557 - - - -
2.7981 11500 0.0511 - - - -
2.8005 11510 0.0426 - - - -
2.8029 11520 0.0555 - - - -
2.8054 11530 0.0764 - - - -
2.8078 11540 0.0464 - - - -
2.8102 11550 0.0751 - - - -
2.8127 11560 0.0633 - - - -
2.8151 11570 0.0387 - - - -
2.8175 11580 0.0685 - - - -
2.8200 11590 0.0439 - - - -
2.8224 11600 0.0348 - - - -
2.8248 11610 0.0645 - - - -
2.8273 11620 0.0528 - - - -
2.8297 11630 0.0615 - - - -
2.8321 11640 0.0636 - - - -
2.8345 11650 0.0804 - - - -
2.8370 11660 0.0613 - - - -
2.8394 11670 0.0259 - - - -
2.8418 11680 0.0494 - - - -
2.8443 11690 0.036 - - - -
2.8467 11700 0.0453 - - - -
2.8491 11710 0.0762 - - - -
2.8516 11720 0.0829 - - - -
2.8540 11730 0.0434 - - - -
2.8564 11740 0.0691 - - - -
2.8589 11750 0.0594 - - - -
2.8613 11760 0.0345 - - - -
2.8637 11770 0.056 - - - -
2.8662 11780 0.0962 - - - -
2.8686 11790 0.0548 - - - -
2.8710 11800 0.0615 - - - -
2.8735 11810 0.0581 - - - -
2.8759 11820 0.0352 - - - -
2.8783 11830 0.0814 - - - -
2.8808 11840 0.0641 - - - -
2.8832 11850 0.0364 - - - -
2.8856 11860 0.0388 - - - -
2.8881 11870 0.0479 - - - -
2.8905 11880 0.0349 - - - -
2.8929 11890 0.0557 - - - -
2.8954 11900 0.0437 - - - -
2.8978 11910 0.0157 - - - -
2.9002 11920 0.0304 - - - -
2.9027 11930 0.0377 - - - -
2.9051 11940 0.0626 - - - -
2.9075 11950 0.0672 - - - -
2.9100 11960 0.0835 - - - -
2.9124 11970 0.0377 - - - -
2.9148 11980 0.0623 - - - -
2.9173 11990 0.0375 - - - -
2.9197 12000 0.0182 - - - -
2.9221 12010 0.0464 - - - -
2.9246 12020 0.074 - - - -
2.9270 12030 0.0604 - - - -
2.9294 12040 0.0447 - - - -
2.9319 12050 0.0231 - - - -
2.9343 12060 0.0759 - - - -
2.9367 12070 0.0592 - - - -
2.9392 12080 0.0412 - - - -
2.9416 12090 0.0554 - - - -
2.9440 12100 0.0086 - - - -
2.9465 12110 0.0605 - - - -
2.9489 12120 0.0522 - - - -
2.9513 12130 0.0822 - - - -
2.9538 12140 0.0603 - - - -
2.9562 12150 0.0762 - - - -
2.9586 12160 0.076 - - - -
2.9611 12170 0.0516 - - - -
2.9635 12180 0.0221 - - - -
2.9659 12190 0.0662 - - - -
2.9684 12200 0.0571 - - - -
2.9708 12210 0.0738 - - - -
2.9732 12220 0.0567 - - - -
2.9757 12230 0.0566 - - - -
2.9781 12240 0.077 - - - -
2.9805 12250 0.0353 - - - -
2.9830 12260 0.0313 - - - -
2.9854 12270 0.0628 - - - -
2.9878 12280 0.0536 - - - -
2.9903 12290 0.0972 - - - -
2.9927 12300 0.0393 - - - -
2.9951 12310 0.0461 - - - -
2.9976 12320 0.0585 - - - -
3.0 12330 0.0923 0.0108 2.1017 0.0314 0.0328

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.1.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}

ContrastiveLoss

@inproceedings{hadsell2006dimensionality,
    author={Hadsell, R. and Chopra, S. and LeCun, Y.},
    booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
    title={Dimensionality Reduction by Learning an Invariant Mapping},
    year={2006},
    volume={2},
    number={},
    pages={1735-1742},
    doi={10.1109/CVPR.2006.100}
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}