SentenceTransformer based on Qwen/Qwen2.5-0.5B-Instruct

This is a sentence-transformers model finetuned from Qwen/Qwen2.5-0.5B-Instruct. It maps sentences & paragraphs to an 896-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Qwen/Qwen2.5-0.5B-Instruct
  • Maximum Sequence Length: 1024 tokens
  • Output Dimensionality: 896 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: Qwen2Model 
  (1): Pooling({'word_embedding_dimension': 896, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
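
The Pooling module above uses mean pooling (pooling_mode_mean_tokens): the sentence embedding is the average of the Qwen2 token embeddings, with padding positions excluded via the attention mask. A minimal NumPy sketch of that step, using toy random embeddings rather than actual model outputs:

```python
import numpy as np

# Toy token embeddings for one sentence: 5 token positions, 896 dims
# (matching the model's word_embedding_dimension). Last two positions are padding.
token_embeddings = np.random.rand(5, 896).astype(np.float32)
attention_mask = np.array([1, 1, 1, 0, 0], dtype=np.float32)

# Mean pooling: zero out padding, then average over the real tokens only.
masked = token_embeddings * attention_mask[:, None]
sentence_embedding = masked.sum(axis=0) / attention_mask.sum()

print(sentence_embedding.shape)  # (896,)
```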

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("AlexWortega/qwen_emb_6k")
# Run inference
sentences = [
    'query: \n\nHuman: 2 cars take 4 hours to drive from Stockholm to Copenhagen. Explain how long it takes for 6 cars to drive the same distance.\n\nAssistant: ',
    "The number of cars doesn't affect the time it takes to travel from Stockholm to Copenhagen, assuming each car is traveling independently and under the same conditions. If it takes one car 4 hours to make the journey, it will also take each of the other cars the same amount of time, 4 hours, to travel the same distance. So, whether you have 2 cars or 6 cars, each car would still take 4 hours to travel from Stockholm to Copenhagen.",
    '6 cars would take 2 hours and 40 minutes to drive from Stockholm to Copenhagen. This is because the total distance remains the same, but with more cars, the time is divided by the number of cars. So, 4 hours divided by 2 cars is 2 hours, and then divided by 3 (since there are now 6 cars) is 40 minutes.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 896]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
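
model.similarity applies the model's configured similarity function, here cosine similarity. The [3, 3] matrix above can be reproduced with plain NumPy (random stand-in embeddings below instead of real model.encode output):

```python
import numpy as np

# Stand-in for `model.encode(sentences)`: three 896-dim embeddings.
embeddings = np.random.rand(3, 896).astype(np.float32)

# Cosine similarity = dot product of L2-normalized vectors.
norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
normalized = embeddings / norms
similarities = normalized @ normalized.T

print(similarities.shape)  # (3, 3)
# The diagonal is ~1.0: each embedding has maximal similarity with itself.
```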

Evaluation

Metrics

Information Retrieval

  • Datasets: NanoClimateFEVER, NanoDBPedia, NanoFEVER, NanoFiQA2018, NanoHotpotQA, NanoMSMARCO, NanoNFCorpus, NanoNQ, NanoQuoraRetrieval, NanoSCIDOCS, NanoArguAna, NanoSciFact and NanoTouche2020
  • Evaluated with InformationRetrievalEvaluator
Metric NanoClimateFEVER NanoDBPedia NanoFEVER NanoFiQA2018 NanoHotpotQA NanoMSMARCO NanoNFCorpus NanoNQ NanoQuoraRetrieval NanoSCIDOCS NanoArguAna NanoSciFact NanoTouche2020
cosine_accuracy@1 0.34 0.62 0.5 0.2 0.56 0.26 0.34 0.24 0.58 0.36 0.1 0.34 0.5102
cosine_accuracy@3 0.5 0.82 0.7 0.46 0.66 0.44 0.54 0.54 0.8 0.6 0.38 0.58 0.7551
cosine_accuracy@5 0.56 0.86 0.76 0.48 0.74 0.5 0.54 0.6 0.82 0.68 0.48 0.62 0.8571
cosine_accuracy@10 0.66 0.92 0.86 0.6 0.86 0.68 0.54 0.66 0.86 0.8 0.62 0.68 0.9796
cosine_precision@1 0.34 0.62 0.5 0.2 0.56 0.26 0.34 0.24 0.58 0.36 0.1 0.34 0.5102
cosine_precision@3 0.1867 0.5333 0.2333 0.1867 0.3 0.1467 0.3333 0.18 0.2867 0.28 0.1267 0.2 0.4626
cosine_precision@5 0.144 0.452 0.156 0.124 0.204 0.1 0.284 0.12 0.188 0.212 0.096 0.132 0.4204
cosine_precision@10 0.094 0.356 0.088 0.088 0.13 0.068 0.222 0.07 0.108 0.142 0.062 0.076 0.3653
cosine_recall@1 0.1757 0.0551 0.48 0.1083 0.28 0.26 0.0208 0.22 0.524 0.072 0.1 0.305 0.037
cosine_recall@3 0.2573 0.1282 0.68 0.2929 0.45 0.44 0.0622 0.51 0.7307 0.1737 0.38 0.55 0.0974
cosine_recall@5 0.314 0.1804 0.74 0.3269 0.51 0.5 0.0758 0.55 0.7613 0.2187 0.48 0.585 0.1441
cosine_recall@10 0.3773 0.247 0.83 0.4267 0.65 0.68 0.1035 0.62 0.8287 0.2937 0.62 0.665 0.2426
cosine_ndcg@10 0.3322 0.4606 0.6623 0.3084 0.5552 0.4463 0.2714 0.4404 0.7084 0.2888 0.3529 0.4924 0.4138
cosine_mrr@10 0.4347 0.7196 0.621 0.3335 0.6367 0.3753 0.4267 0.396 0.6884 0.4936 0.2687 0.454 0.6687
cosine_map@100 0.2716 0.3188 0.6078 0.2464 0.48 0.3896 0.1078 0.389 0.6682 0.2126 0.2836 0.4366 0.3139

Nano BEIR (mean over the 13 datasets above)

Metric Value
cosine_accuracy@1 0.3808
cosine_accuracy@3 0.5981
cosine_accuracy@5 0.6536
cosine_accuracy@10 0.7477
cosine_precision@1 0.3808
cosine_precision@3 0.2658
cosine_precision@5 0.2025
cosine_precision@10 0.1438
cosine_recall@1 0.2029
cosine_recall@3 0.3656
cosine_recall@5 0.4143
cosine_recall@10 0.5065
cosine_ndcg@10 0.441
cosine_mrr@10 0.5013
cosine_map@100 0.3635
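
For reference, the reported metrics follow the standard information-retrieval definitions. A toy pure-Python illustration for a single query (the document IDs and relevance judgments below are made up; this is not the InformationRetrievalEvaluator implementation):

```python
# Documents ordered by descending cosine similarity to the query (hypothetical).
ranked = ["d3", "d7", "d1", "d9", "d2"]
relevant = {"d1", "d2"}  # ground-truth relevant documents for this query

def accuracy_at_k(ranked, relevant, k):
    # 1.0 if any relevant document appears in the top k, else 0.0.
    return float(any(d in relevant for d in ranked[:k]))

def precision_at_k(ranked, relevant, k):
    # Fraction of the top-k results that are relevant.
    return sum(d in relevant for d in ranked[:k]) / k

def recall_at_k(ranked, relevant, k):
    # Fraction of all relevant documents retrieved in the top k.
    return sum(d in relevant for d in ranked[:k]) / len(relevant)

def mrr_at_k(ranked, relevant, k):
    # Reciprocal rank of the first relevant document within the top k.
    for rank, d in enumerate(ranked[:k], start=1):
        if d in relevant:
            return 1.0 / rank
    return 0.0

print(accuracy_at_k(ranked, relevant, 3))   # 1.0 (d1 is in the top 3)
print(precision_at_k(ranked, relevant, 5))  # 0.4
print(recall_at_k(ranked, relevant, 3))     # 0.5
print(mrr_at_k(ranked, relevant, 10))       # 0.333... (first hit at rank 3)
```

The card's numbers are these per-query values averaged over each dataset's queries (and, for Nano BEIR, over the 13 datasets).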

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,580,101 training samples
  • Columns: query, response, and negative
  • Approximate statistics based on the first 1000 samples (all columns are strings):
    • query: min 14 tokens, mean 149.45 tokens, max 1024 tokens
    • response: min 2 tokens, mean 249.27 tokens, max 1024 tokens
    • negative: min 1 token, mean 174.24 tokens, max 1024 tokens
  • Samples:
    query response negative
    query:

    Human: 0.002 = 1000
    1 = x?

    Assistant:
    To find the value of x, we can set up a proportion using the given information:

    0.002/1000 = 1/x

    To solve for x, we can cross multiply:

    0.002 * x = 1000 * 1

    0.002x = 1000

    Dividing both sides by 0.002:

    x = 1000 / 0.002

    x = 500,000

    Therefore, 1 is equal to 500,000 in this proportion.
    I apologize, but the equation you provided is not correct.

    0.002 is equal to 0.002, not 1000.

    1 is equal to x, not a number.

    Can you please provide a correct equation or problem for me to solve?
    query:

    Human: 0:00
    what's going on guys it's NAME_1 from
    0:01
    NAME_2 and today we're going to
    0:03
    talk about cns fatigue
    0:05
    so going over uh cns fatigue central
    0:07
    nervous system fatigue it's something
    0:08
    that people talk about and bring up a
    0:10
    lot
    0:10
    there's a bit of science to it and
    0:12
    there's also a lot of bro
    0:13
    culture myth to it and i want to get
    0:15
    into that and i'm i'm going to talk
    0:17
    really quick about where it comes from
    0:19
    why we talk about it this way and then
    0:21
    how it kind of gets used and misused
    0:23
    to describe a certain training
    0:25
    phenomenon that we all experience
    0:27
    so i'm going to start out by kind of
    0:29
    fleshing out what we're talking about or
    0:31
    how it kind of originally became
    0:34
    discussed about so we observe a few
    0:35
    things when it comes to weight training
    0:37
    specifically with lifting heavy
    0:38
    the first thing we observe is that true
    0:40
    maximal attempts at 90
    0:42
    and up rpe nine to ten so that's really
    0:45
    hard tri...
    Hello! It seems like you're sharing a transcript about Central Nervous System (CNS) fatigue, particularly in relation to weight training. CNS fatigue is a topic that's often discussed in the context of high-intensity workouts, such as weightlifting.

    The transcript you've shared discusses the concept of CNS fatigue, its origins, and how it's often misused to describe certain training phenomena. It also talks about the effects of lifting heavy weights at maximum capacity in every session, which can lead to short-term regression. This is a common experience among weightlifters who push their limits consistently without giving their bodies adequate time to recover.

    The speaker also mentions the common practice of spacing out heavy lifting sessions or incorporating deload periods to prevent this regression. The time between the last heavy attempt and a contest day is also noted to be longer, presumably to allow for adequate recovery and prevent CNS fatigue.

    If you have any specific ques...
    0:00
    in the past
    0:00
    with the same lift
    0:00
    and the same weight
    0:00
    and the same load
    0:00
    then
    0:00
    if you have a deload day and you come back
    0:00
    and you're feeling like you can handle
    0:00
    heavier loads
    0:00
    with a little bit more energy
    0:00
    then the load is typically a little bit
    0:01
    lighter than the previous day's load
    0:02
    and the time between contest day and the
    0:03
    last heavy attempt
    0:03
    is typically a lot shorter
    0:04
    than the time between the previous day's
    0:05
    heavy attempt and the contest day
    0:06
    and that's a general observation that has
    0:08
    also been seen in other strength training
    0:09
    styles
    0:10
    and it's something that's been uh
    0:12
    accounted for in the wisdom of
    0:13
    trainers and coaches over the years
    0:14
    and it's a good idea to keep in mind when
    0:16
    you're lifting heavy or you're training in
    0:17
    a deloads schedule
    0:18
    and you have a lot of energy and you're
    0:20
    able to handle heavier loads
    0:21
    the load is typically a little bit
    0:22
    lighter than...
    query:

    Human: 0:00
    what's going on guys it's NAME_1 from
    0:01
    NAME_2 and today we're going to
    0:03
    talk about cns fatigue
    0:05
    so going over uh cns fatigue central
    0:07
    nervous system fatigue it's something
    0:08
    that people talk about and bring up a
    0:10
    lot
    0:10
    there's a bit of science to it and
    0:12
    there's also a lot of bro
    0:13
    culture myth to it and i want to get
    0:15
    into that and i'm i'm going to talk
    0:17
    really quick about where it comes from
    0:19
    why we talk about it this way and then
    0:21
    how it kind of gets used and misused
    0:23
    to describe a certain training
    0:25
    phenomenon that we all experience
    0:27
    so i'm going to start out by kind of
    0:29
    fleshing out what we're talking about or
    0:31
    how it kind of originally became
    0:34
    discussed about so we observe a few
    0:35
    things when it comes to weight training
    0:37
    specifically with lifting heavy
    0:38
    the first thing we observe is that true
    0:40
    maximal attempts at 90
    0:42
    and up rpe nine to ten so that's really
    0:45
    hard tri...
    0:00
    Hey there! Today we're going to talk about CNS fatigue, or central nervous system fatigue. This is a term that is often used in the fitness community, but there is some confusion and misinformation surrounding it. So let's dive in and discuss what it really means and how it affects our training.

    First, let's define what we mean by CNS fatigue. This is the idea that our central nervous system, which controls our brain and spinal cord, can become fatigued from intense physical activity. This can manifest as a decrease in performance, strength, and coordination.

    One of the main observations that has led to the concept of CNS fatigue is that when we lift heavy weights at a high intensity, such as a 9 or 10 on the RPE scale, we tend to see short-term regression in our performance. This means that if we continue to lift at this intensity every session, we may eventually see a decrease in our strength and ability to handle heavy loads.

    Additionally, we also see that the time between a...
    Hi, I'm an AI language model. How can I assist you today?
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
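
MultipleNegativesRankingLoss treats each query's paired response as its positive and the other documents in the batch as negatives, scoring candidates with scaled cosine similarity and applying cross-entropy. A simplified NumPy sketch under the parameters above (in-batch negatives only; the actual sentence-transformers loss also appends the explicit negative column to the candidate set):

```python
import numpy as np

def mnr_loss(query_emb, doc_emb, scale=20.0):
    # Simplified MultipleNegativesRankingLoss sketch (cos_sim, scale=20.0).
    # query_emb[i] pairs with doc_emb[i]; all other rows of doc_emb act as
    # in-batch negatives. Illustrative only, not the library implementation.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    scores = scale * (q @ d.T)  # (batch, batch) scaled cosine similarities
    # Cross-entropy with the diagonal (the true pairs) as the target class.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 896))
docs = rng.normal(size=(4, 896))
loss = mnr_loss(queries, docs)
print(loss)  # positive scalar; near log(batch_size) for random embeddings
```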
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • gradient_accumulation_steps: 32
  • learning_rate: 2e-05
  • max_grad_norm: 0.4
  • num_train_epochs: 1
  • warmup_ratio: 0.4
  • bf16: True
  • prompts: {'query': 'query: ', 'answer': 'document: '}
  • batch_sampler: no_duplicates
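
With these settings, the effective batch size and schedule can be sanity-checked by hand (illustrative arithmetic assuming a single device; not taken from the training code):

```python
import math

# Values from the hyperparameters and dataset size above.
num_samples = 1_580_101
per_device_train_batch_size = 8
gradient_accumulation_steps = 32
warmup_ratio = 0.4

# Each optimizer step accumulates gradients over 32 micro-batches of 8.
effective_batch_size = per_device_train_batch_size * gradient_accumulation_steps
print(effective_batch_size)  # 256

# Approximate optimizer steps in the single training epoch, and the
# linear-warmup steps implied by warmup_ratio.
total_steps = math.ceil(num_samples / effective_batch_size)
warmup_steps = int(warmup_ratio * total_steps)
print(total_steps)   # 6173, consistent with the ~6,170 steps/epoch in the logs
print(warmup_steps)  # 2469
```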

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 32
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 0.4
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.4
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: {'query': 'query: ', 'answer': 'document: '}
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss NanoClimateFEVER_cosine_ndcg@10 NanoDBPedia_cosine_ndcg@10 NanoFEVER_cosine_ndcg@10 NanoFiQA2018_cosine_ndcg@10 NanoHotpotQA_cosine_ndcg@10 NanoMSMARCO_cosine_ndcg@10 NanoNFCorpus_cosine_ndcg@10 NanoNQ_cosine_ndcg@10 NanoQuoraRetrieval_cosine_ndcg@10 NanoSCIDOCS_cosine_ndcg@10 NanoArguAna_cosine_ndcg@10 NanoSciFact_cosine_ndcg@10 NanoTouche2020_cosine_ndcg@10 NanoBEIR_mean_cosine_ndcg@10
0.0016 10 2.3323 - - - - - - - - - - - - - -
0.0032 20 2.2923 - - - - - - - - - - - - - -
0.0049 30 2.2011 - - - - - - - - - - - - - -
0.0065 40 2.4198 - - - - - - - - - - - - - -
0.0081 50 2.4304 - - - - - - - - - - - - - -
0.0097 60 2.35 - - - - - - - - - - - - - -
0.0113 70 2.4141 - - - - - - - - - - - - - -
0.0130 80 2.4043 - - - - - - - - - - - - - -
0.0146 90 2.2222 - - - - - - - - - - - - - -
0.0162 100 2.4379 - - - - - - - - - - - - - -
0.0178 110 2.4722 - - - - - - - - - - - - - -
0.0194 120 2.9719 - - - - - - - - - - - - - -
0.0211 130 2.5376 - - - - - - - - - - - - - -
0.0227 140 2.4272 - - - - - - - - - - - - - -
0.0243 150 2.1056 - - - - - - - - - - - - - -
0.0259 160 2.1292 - - - - - - - - - - - - - -
0.0275 170 1.9443 - - - - - - - - - - - - - -
0.0292 180 1.8512 - - - - - - - - - - - - - -
0.0308 190 1.7141 - - - - - - - - - - - - - -
0.0324 200 1.8382 - - - - - - - - - - - - - -
0.0340 210 1.7891 - - - - - - - - - - - - - -
0.0356 220 1.6014 - - - - - - - - - - - - - -
0.0373 230 1.5022 - - - - - - - - - - - - - -
0.0389 240 1.412 - - - - - - - - - - - - - -
0.0405 250 1.3756 - - - - - - - - - - - - - -
0.0421 260 2.6414 - - - - - - - - - - - - - -
0.0437 270 1.6938 - - - - - - - - - - - - - -
0.0454 280 2.953 - - - - - - - - - - - - - -
0.0470 290 2.9116 - - - - - - - - - - - - - -
0.0486 300 1.273 - - - - - - - - - - - - - -
0.0502 310 1.4269 - - - - - - - - - - - - - -
0.0518 320 1.5998 - - - - - - - - - - - - - -
0.0535 330 1.5939 - - - - - - - - - - - - - -
0.0551 340 1.4772 - - - - - - - - - - - - - -
0.0567 350 1.162 - - - - - - - - - - - - - -
0.0583 360 1.4587 - - - - - - - - - - - - - -
0.0599 370 1.5296 - - - - - - - - - - - - - -
0.0616 380 1.6156 - - - - - - - - - - - - - -
0.0632 390 1.3018 - - - - - - - - - - - - - -
0.0648 400 1.5415 - - - - - - - - - - - - - -
0.0664 410 1.5115 - - - - - - - - - - - - - -
0.0680 420 2.435 - - - - - - - - - - - - - -
0.0697 430 1.7281 - - - - - - - - - - - - - -
0.0713 440 2.0099 - - - - - - - - - - - - - -
0.0729 450 1.2842 - - - - - - - - - - - - - -
0.0745 460 1.4389 - - - - - - - - - - - - - -
0.0761 470 1.3 - - - - - - - - - - - - - -
0.0778 480 1.3392 - - - - - - - - - - - - - -
0.0794 490 1.0975 - - - - - - - - - - - - - -
0.0810 500 1.2641 - - - - - - - - - - - - - -
0.0826 510 1.2011 - - - - - - - - - - - - - -
0.0842 520 1.3416 - - - - - - - - - - - - - -
0.0859 530 1.3424 - - - - - - - - - - - - - -
0.0875 540 1.29 - - - - - - - - - - - - - -
0.0891 550 1.383 - - - - - - - - - - - - - -
0.0907 560 0.971 - - - - - - - - - - - - - -
0.0923 570 1.0089 - - - - - - - - - - - - - -
0.0940 580 0.974 - - - - - - - - - - - - - -
0.0956 590 0.9482 - - - - - - - - - - - - - -
0.0972 600 1.2337 0.1808 0.1027 0.2711 0.0854 0.1819 0.0510 0.0569 0.0039 0.6725 0.0714 0.3804 0.3058 0.3047 0.2053
0.0988 610 1.1811 - - - - - - - - - - - - - -
0.1004 620 1.0868 - - - - - - - - - - - - - -
0.1021 630 1.1908 - - - - - - - - - - - - - -
0.1037 640 1.0508 - - - - - - - - - - - - - -
0.1053 650 1.097 - - - - - - - - - - - - - -
0.1069 660 0.9266 - - - - - - - - - - - - - -
0.1085 670 1.2172 - - - - - - - - - - - - - -
0.1102 680 1.1388 - - - - - - - - - - - - - -
0.1118 690 1.1859 - - - - - - - - - - - - - -
0.1134 700 0.8618 - - - - - - - - - - - - - -
0.1150 710 1.0641 - - - - - - - - - - - - - -
0.1167 720 1.1092 - - - - - - - - - - - - - -
0.1183 730 0.7565 - - - - - - - - - - - - - -
0.1199 740 0.7026 - - - - - - - - - - - - - -
0.1215 750 1.0661 - - - - - - - - - - - - - -
0.1231 760 1.3258 - - - - - - - - - - - - - -
0.1248 770 1.5056 - - - - - - - - - - - - - -
0.1264 780 1.0812 - - - - - - - - - - - - - -
0.1280 790 1.0357 - - - - - - - - - - - - - -
0.1296 800 1.2638 - - - - - - - - - - - - - -
0.1312 810 1.7064 - - - - - - - - - - - - - -
0.1329 820 1.4948 - - - - - - - - - - - - - -
0.1345 830 1.0338 - - - - - - - - - - - - - -
0.1361 840 0.9158 - - - - - - - - - - - - - -
0.1377 850 0.9544 - - - - - - - - - - - - - -
0.1393 860 1.8469 - - - - - - - - - - - - - -
0.1410 870 1.3733 - - - - - - - - - - - - - -
0.1426 880 0.8882 - - - - - - - - - - - - - -
0.1442 890 1.0591 - - - - - - - - - - - - - -
0.1458 900 1.0214 - - - - - - - - - - - - - -
0.1474 910 1.0111 - - - - - - - - - - - - - -
0.1491 920 0.783 - - - - - - - - - - - - - -
0.1507 930 0.9901 - - - - - - - - - - - - - -
0.1523 940 1.0508 - - - - - - - - - - - - - -
0.1539 950 1.6198 - - - - - - - - - - - - - -
0.1555 960 1.4054 - - - - - - - - - - - - - -
0.1572 970 2.0936 - - - - - - - - - - - - - -
0.1588 980 2.0536 - - - - - - - - - - - - - -
0.1604 990 1.595 - - - - - - - - - - - - - -
0.1620 1000 1.0133 - - - - - - - - - - - - - -
0.1636 1010 0.8841 - - - - - - - - - - - - - -
0.1653 1020 0.8795 - - - - - - - - - - - - - -
0.1669 1030 0.821 - - - - - - - - - - - - - -
0.1685 1040 0.9551 - - - - - - - - - - - - - -
0.1701 1050 0.8831 - - - - - - - - - - - - - -
0.1717 1060 0.8877 - - - - - - - - - - - - - -
0.1734 1070 0.9293 - - - - - - - - - - - - - -
0.1750 1080 1.1628 - - - - - - - - - - - - - -
0.1766 1090 1.0334 - - - - - - - - - - - - - -
0.1782 1100 0.9041 - - - - - - - - - - - - - -
0.1798 1110 0.8715 - - - - - - - - - - - - - -
0.1815 1120 0.6835 - - - - - - - - - - - - - -
0.1831 1130 0.9067 - - - - - - - - - - - - - -
0.1847 1140 0.9845 - - - - - - - - - - - - - -
0.1863 1150 0.9605 - - - - - - - - - - - - - -
0.1879 1160 0.9137 - - - - - - - - - - - - - -
0.1896 1170 0.8297 - - - - - - - - - - - - - -
0.1912 1180 0.9854 - - - - - - - - - - - - - -
0.1928 1190 1.0456 - - - - - - - - - - - - - -
0.1944 1200 0.8366 0.2868 0.2325 0.5528 0.1413 0.2869 0.0953 0.1302 0.0794 0.7002 0.1748 0.4492 0.3688 0.3810 0.2984
0.1960 1210 0.7654 - - - - - - - - - - - - - -
0.1977 1220 0.977 - - - - - - - - - - - - - -
0.1993 1230 0.64 - - - - - - - - - - - - - -
0.2009 1240 1.3624 - - - - - - - - - - - - - -
0.2025 1250 1.2971 - - - - - - - - - - - - - -
0.2041 1260 1.1123 - - - - - - - - - - - - - -
0.2058 1270 0.9836 - - - - - - - - - - - - - -
0.2074 1280 0.7819 - - - - - - - - - - - - - -
0.2090 1290 0.8977 - - - - - - - - - - - - - -
0.2106 1300 0.9156 - - - - - - - - - - - - - -
0.2122 1310 0.8029 - - - - - - - - - - - - - -
0.2139 1320 1.1394 - - - - - - - - - - - - - -
0.2155 1330 0.9088 - - - - - - - - - - - - - -
0.2171 1340 0.8174 - - - - - - - - - - - - - -
0.2187 1350 1.3159 - - - - - - - - - - - - - -
0.2203 1360 1.0255 - - - - - - - - - - - - - -
0.2220 1370 1.1159 - - - - - - - - - - - - - -
0.2236 1380 0.9766 - - - - - - - - - - - - - -
0.2252 1390 0.9058 - - - - - - - - - - - - - -
0.2268 1400 0.88 - - - - - - - - - - - - - -
0.2284 1410 0.8224 - - - - - - - - - - - - - -
0.2301 1420 0.6394 - - - - - - - - - - - - - -
0.2317 1430 0.7517 - - - - - - - - - - - - - -
0.2333 1440 0.8308 - - - - - - - - - - - - - -
0.2349 1450 0.811 - - - - - - - - - - - - - -
0.2365 1460 0.8963 - - - - - - - - - - - - - -
0.2382 1470 0.9781 - - - - - - - - - - - - - -
0.2398 1480 0.8422 - - - - - - - - - - - - - -
0.2414 1490 0.8144 - - - - - - - - - - - - - -
0.2430 1500 0.7655 - - - - - - - - - - - - - -
0.2446 1510 0.6322 - - - - - - - - - - - - - -
0.2463 1520 0.6661 - - - - - - - - - - - - - -
0.2479 1530 0.7723 - - - - - - - - - - - - - -
0.2495 1540 0.7734 - - - - - - - - - - - - - -
0.2511 1550 0.8246 - - - - - - - - - - - - - -
0.2527 1560 0.7604 - - - - - - - - - - - - - -
0.2544 1570 0.8196 - - - - - - - - - - - - - -
0.2560 1580 0.7278 - - - - - - - - - - - - - -
0.2576 1590 0.7076 - - - - - - - - - - - - - -
0.2592 1600 0.6913 - - - - - - - - - - - - - -
0.2608 1610 0.6974 - - - - - - - - - - - - - -
0.2625 1620 0.7015 - - - - - - - - - - - - - -
0.2641 1630 0.677 - - - - - - - - - - - - - -
0.2657 1640 0.7185 - - - - - - - - - - - - - -
0.2673 1650 0.665 - - - - - - - - - - - - - -
0.2689 1660 0.7026 - - - - - - - - - - - - - -
0.2706 1670 0.6374 - - - - - - - - - - - - - -
0.2722 1680 0.652 - - - - - - - - - - - - - -
0.2738 1690 0.7426 - - - - - - - - - - - - - -
0.2754 1700 0.6444 - - - - - - - - - - - - - -
0.2770 1710 0.663 - - - - - - - - - - - - - -
0.2787 1720 0.6476 - - - - - - - - - - - - - -
0.2803 1730 0.6857 - - - - - - - - - - - - - -
0.2819 1740 0.6229 - - - - - - - - - - - - - -
0.2835 1750 0.5756 - - - - - - - - - - - - - -
0.2851 1760 0.6839 - - - - - - - - - - - - - -
0.2868 1770 0.8267 - - - - - - - - - - - - - -
0.2884 1780 0.8146 - - - - - - - - - - - - - -
0.2900 1790 0.7093 - - - - - - - - - - - - - -
0.2916 1800 0.7307 0.2597 0.2742 0.6859 0.2218 0.4912 0.2921 0.1728 0.3219 0.7381 0.2529 0.4898 0.4819 0.5037 0.3989
0.2932 1810 0.606 - - - - - - - - - - - - - -
0.2949 1820 0.6338 - - - - - - - - - - - - - -
0.2965 1830 0.5849 - - - - - - - - - - - - - -
0.2981 1840 0.699 - - - - - - - - - - - - - -
0.2997 1850 0.6164 - - - - - - - - - - - - - -
0.3013 1860 0.574 - - - - - - - - - - - - - -
0.3030 1870 0.5819 - - - - - - - - - - - - - -
0.3046 1880 0.5177 - - - - - - - - - - - - - -
0.3062 1890 0.6006 - - - - - - - - - - - - - -
0.3078 1900 0.6981 - - - - - - - - - - - - - -
0.3094 1910 0.885 - - - - - - - - - - - - - -
0.3111 1920 1.2742 - - - - - - - - - - - - - -
0.3127 1930 0.7133 - - - - - - - - - - - - - -
0.3143 1940 0.7271 - - - - - - - - - - - - - -
0.3159 1950 1.3258 - - - - - - - - - - - - - -
0.3175 1960 1.2689 - - - - - - - - - - - - - -
0.3192 1970 0.6723 - - - - - - - - - - - - - -
0.3208 1980 0.3596 - - - - - - - - - - - - - -
0.3224 1990 0.4078 - - - - - - - - - - - - - -
0.3240 2000 0.287 - - - - - - - - - - - - - -
0.3256 2010 0.2375 - - - - - - - - - - - - - -
0.3273 2020 0.2259 - - - - - - - - - - - - - -
0.3289 2030 0.3889 - - - - - - - - - - - - - -
0.3305 2040 0.7391 - - - - - - - - - - - - - -
0.3321 2050 0.5417 - - - - - - - - - - - - - -
0.3338 2060 0.4933 - - - - - - - - - - - - - -
0.3354 2070 0.426 - - - - - - - - - - - - - -
0.3370 2080 0.4222 - - - - - - - - - - - - - -
0.3386 2090 0.4132 - - - - - - - - - - - - - -
0.3402 2100 0.4133 - - - - - - - - - - - - - -
0.3419 2110 0.3989 - - - - - - - - - - - - - -
0.3435 2120 0.4035 - - - - - - - - - - - - - -
0.3451 2130 0.3804 - - - - - - - - - - - - - -
0.3467 2140 0.3597 - - - - - - - - - - - - - -
0.3483 2150 0.3793 - - - - - - - - - - - - - -
0.3500 2160 0.3633 - - - - - - - - - - - - - -
0.3516 2170 0.3504 - - - - - - - - - - - - - -
0.3532 2180 0.3475 - - - - - - - - - - - - - -
0.3548 2190 0.3467 - - - - - - - - - - - - - -
0.3564 2200 0.3412 - - - - - - - - - - - - - -
0.3581 2210 0.3665 - - - - - - - - - - - - - -
0.3597 2220 0.3585 - - - - - - - - - - - - - -
0.3613 2230 0.3335 - - - - - - - - - - - - - -
0.3629 2240 0.329 - - - - - - - - - - - - - -
0.3645 2250 0.3193 - - - - - - - - - - - - - -
0.3662 2260 0.3256 - - - - - - - - - - - - - -
0.3678 2270 0.325 - - - - - - - - - - - - - -
0.3694 2280 0.3312 - - - - - - - - - - - - - -
0.3710 2290 0.3323 - - - - - - - - - - - - - -
0.3726 2300 0.3192 - - - - - - - - - - - - - -
0.3743 2310 0.3366 - - - - - - - - - - - - - -
0.3759 2320 0.3247 - - - - - - - - - - - - - -
0.3775 2330 0.3207 - - - - - - - - - - - - - -
0.3791 2340 0.3238 - - - - - - - - - - - - - -
0.3807 2350 0.3217 - - - - - - - - - - - - - -
0.3824 2360 0.336 - - - - - - - - - - - - - -
0.3840 2370 0.3043 - - - - - - - - - - - - - -
0.3856 2380 0.3043 - - - - - - - - - - - - - -
0.3872 2390 0.3193 - - - - - - - - - - - - - -
0.3888 2400 0.3145 0.2338 0.4041 0.7329 0.2612 0.4511 0.3624 0.2742 0.3903 0.2020 0.2560 0.3127 0.5038 0.4262 0.3701
0.3905 2410 0.319 - - - - - - - - - - - - - -
0.3921 2420 0.3097 - - - - - - - - - - - - - -
0.3937 2430 0.2817 - - - - - - - - - - - - - -
0.3953 2440 0.3168 - - - - - - - - - - - - - -
0.3969 2450 0.2941 - - - - - - - - - - - - - -
0.3986 2460 0.2902 - - - - - - - - - - - - - -
0.4002 2470 0.3095 - - - - - - - - - - - - - -
0.4018 2480 0.3149 - - - - - - - - - - - - - -
0.4034 2490 0.2949 - - - - - - - - - - - - - -
0.4050 2500 0.3057 - - - - - - - - - - - - - -
0.4067 2510 0.2982 - - - - - - - - - - - - - -
0.4083 2520 0.3064 - - - - - - - - - - - - - -
0.4099 2530 0.3169 - - - - - - - - - - - - - -
0.4115 2540 0.2922 - - - - - - - - - - - - - -
0.4131 2550 0.2999 - - - - - - - - - - - - - -
0.4148 2560 0.2803 - - - - - - - - - - - - - -
0.4164 2570 0.3118 - - - - - - - - - - - - - -
0.4180 2580 0.309 - - - - - - - - - - - - - -
0.4196 2590 0.2894 - - - - - - - - - - - - - -
0.4212 2600 0.3126 - - - - - - - - - - - - - -
0.4229 2610 0.2949 - - - - - - - - - - - - - -
0.4245 2620 0.3204 - - - - - - - - - - - - - -
0.4261 2630 0.2868 - - - - - - - - - - - - - -
0.4277 2640 0.3168 - - - - - - - - - - - - - -
0.4293 2650 0.3245 - - - - - - - - - - - - - -
0.4310 2660 0.316 - - - - - - - - - - - - - -
0.4326 2670 0.2822 - - - - - - - - - - - - - -
0.4342 2680 0.3046 - - - - - - - - - - - - - -
0.4358 2690 0.2908 - - - - - - - - - - - - - -
0.4374 2700 0.2542 - - - - - - - - - - - - - -
0.4391 2710 0.3079 - - - - - - - - - - - - - -
0.4407 2720 0.2821 - - - - - - - - - - - - - -
0.4423 2730 0.2863 - - - - - - - - - - - - - -
0.4439 2740 0.2889 - - - - - - - - - - - - - -
0.4455 2750 0.282 - - - - - - - - - - - - - -
0.4472 2760 0.29 - - - - - - - - - - - - - -
0.4488 2770 0.2973 - - - - - - - - - - - - - -
0.4504 2780 0.3018 - - - - - - - - - - - - - -
0.4520 2790 0.2938 - - - - - - - - - - - - - -
0.4536 2800 0.2835 - - - - - - - - - - - - - -
0.4553 2810 0.2773 - - - - - - - - - - - - - -
0.4569 2820 0.2867 - - - - - - - - - - - - - -
0.4585 2830 0.2954 - - - - - - - - - - - - - -
0.4601 2840 0.3035 - - - - - - - - - - - - - -
0.4617 2850 0.2905 - - - - - - - - - - - - - -
0.4634 2860 0.2821 - - - - - - - - - - - - - -
0.4650 2870 0.2815 - - - - - - - - - - - - - -
0.4666 2880 0.298 - - - - - - - - - - - - - -
0.4682 2890 0.2905 - - - - - - - - - - - - - -
0.4698 2900 0.2821 - - - - - - - - - - - - - -
0.4715 2910 0.2904 - - - - - - - - - - - - - -
0.4731 2920 0.2992 - - - - - - - - - - - - - -
0.4747 2930 0.2834 - - - - - - - - - - - - - -
0.4763 2940 0.2855 - - - - - - - - - - - - - -
0.4779 2950 0.2775 - - - - - - - - - - - - - -
0.4796 2960 0.2994 - - - - - - - - - - - - - -
0.4812 2970 0.2939 - - - - - - - - - - - - - -
0.4828 2980 0.2999 - - - - - - - - - - - - - -
0.4844 2990 0.2935 - - - - - - - - - - - - - -
0.4860 3000 0.2714 0.2471 0.3962 0.7912 0.2469 0.4488 0.3739 0.2677 0.3976 0.1890 0.2485 0.2962 0.4538 0.4259 0.3679
0.4877 3010 0.2819 - - - - - - - - - - - - - -
0.4893 3020 0.2679 - - - - - - - - - - - - - -
0.4909 3030 0.2789 - - - - - - - - - - - - - -
0.4925 3040 0.2865 - - - - - - - - - - - - - -
0.4941 3050 0.2852 - - - - - - - - - - - - - -
0.4958 3060 0.2706 - - - - - - - - - - - - - -
0.4974 3070 0.2935 - - - - - - - - - - - - - -
0.4990 3080 0.272 - - - - - - - - - - - - - -
0.5006 3090 0.2915 - - - - - - - - - - - - - -
0.5022 3100 0.2826 - - - - - - - - - - - - - -
0.5039 3110 0.2652 - - - - - - - - - - - - - -
0.5055 3120 0.2887 - - - - - - - - - - - - - -
0.5071 3130 0.2613 - - - - - - - - - - - - - -
0.5087 3140 0.283 - - - - - - - - - - - - - -
0.5103 3150 0.2945 - - - - - - - - - - - - - -
0.5120 3160 0.2877 - - - - - - - - - - - - - -
0.5136 3170 0.2889 - - - - - - - - - - - - - -
0.5152 3180 0.268 - - - - - - - - - - - - - -
0.5168 3190 0.2911 - - - - - - - - - - - - - -
0.5184 3200 0.2785 - - - - - - - - - - - - - -
0.5201 3210 0.2711 - - - - - - - - - - - - - -
0.5217 3220 0.2911 - - - - - - - - - - - - - -
0.5233 3230 0.2649 - - - - - - - - - - - - - -
0.5249 3240 0.3054 - - - - - - - - - - - - - -
0.5265 3250 0.2531 - - - - - - - - - - - - - -
0.5282 3260 0.2767 - - - - - - - - - - - - - -
0.5298 3270 0.2853 - - - - - - - - - - - - - -
0.5314 3280 0.2731 - - - - - - - - - - - - - -
0.5330 3290 0.2776 - - - - - - - - - - - - - -
0.5346 3300 0.2725 - - - - - - - - - - - - - -
0.5363 3310 0.281 - - - - - - - - - - - - - -
0.5379 3320 0.2666 - - - - - - - - - - - - - -
0.5395 3330 0.2654 - - - - - - - - - - - - - -
0.5411 3340 0.2909 - - - - - - - - - - - - - -
0.5427 3350 0.2598 - - - - - - - - - - - - - -
0.5444 3360 0.2837 - - - - - - - - - - - - - -
0.5460 3370 0.2855 - - - - - - - - - - - - - -
0.5476 3380 0.2601 - - - - - - - - - - - - - -
0.5492 3390 0.268 - - - - - - - - - - - - - -
0.5508 3400 0.2681 - - - - - - - - - - - - - -
0.5525 3410 0.2663 - - - - - - - - - - - - - -
0.5541 3420 0.2837 - - - - - - - - - - - - - -
0.5557 3430 0.259 - - - - - - - - - - - - - -
0.5573 3440 0.2622 - - - - - - - - - - - - - -
0.5590 3450 0.2825 - - - - - - - - - - - - - -
0.5606 3460 0.2921 - - - - - - - - - - - - - -
0.5622 3470 0.2721 - - - - - - - - - - - - - -
0.5638 3480 0.2797 - - - - - - - - - - - - - -
0.5654 3490 0.2899 - - - - - - - - - - - - - -
0.5671 3500 0.2745 - - - - - - - - - - - - - -
0.5687 3510 0.2665 - - - - - - - - - - - - - -
0.5703 3520 0.2908 - - - - - - - - - - - - - -
0.5719 3530 0.2492 - - - - - - - - - - - - - -
0.5735 3540 0.2562 - - - - - - - - - - - - - -
0.5752 3550 0.2616 - - - - - - - - - - - - - -
0.5768 3560 0.2775 - - - - - - - - - - - - - -
0.5784 3570 0.2736 - - - - - - - - - - - - - -
0.5800 3580 0.2862 - - - - - - - - - - - - - -
0.5816 3590 0.2582 - - - - - - - - - - - - - -
0.5833 3600 0.2547 0.2371 0.3994 0.7786 0.2418 0.4072 0.3469 0.2615 0.4070 0.1551 0.2294 0.2533 0.4270 0.4161 0.3508
0.5849 3610 0.2822 - - - - - - - - - - - - - -
0.5865 3620 0.2622 - - - - - - - - - - - - - -
0.5881 3630 0.2691 - - - - - - - - - - - - - -
0.5897 3640 0.2585 - - - - - - - - - - - - - -
0.5914 3650 0.2927 - - - - - - - - - - - - - -
0.5930 3660 0.2593 - - - - - - - - - - - - - -
0.5946 3670 0.2501 - - - - - - - - - - - - - -
0.5962 3680 0.2796 - - - - - - - - - - - - - -
0.5978 3690 0.2622 - - - - - - - - - - - - - -
0.5995 3700 0.2508 - - - - - - - - - - - - - -
0.6011 3710 0.2891 - - - - - - - - - - - - - -
0.6027 3720 0.274 - - - - - - - - - - - - - -
0.6043 3730 0.2769 - - - - - - - - - - - - - -
0.6059 3740 0.2617 - - - - - - - - - - - - - -
0.6076 3750 0.2557 - - - - - - - - - - - - - -
0.6092 3760 0.2634 - - - - - - - - - - - - - -
0.6108 3770 0.262 - - - - - - - - - - - - - -
0.6124 3780 0.2696 - - - - - - - - - - - - - -
0.6140 3790 0.2608 - - - - - - - - - - - - - -
0.6157 3800 0.2592 - - - - - - - - - - - - - -
0.6173 3810 0.2757 - - - - - - - - - - - - - -
0.6189 3820 0.2672 - - - - - - - - - - - - - -
0.6205 3830 0.2523 - - - - - - - - - - - - - -
0.6221 3840 0.2775 - - - - - - - - - - - - - -
0.6238 3850 0.2621 - - - - - - - - - - - - - -
0.6254 3860 0.275 - - - - - - - - - - - - - -
0.6270 3870 0.2727 - - - - - - - - - - - - - -
0.6286 3880 0.2709 - - - - - - - - - - - - - -
0.6302 3890 0.2749 - - - - - - - - - - - - - -
0.6319 3900 0.2844 - - - - - - - - - - - - - -
0.6335 3910 0.2713 - - - - - - - - - - - - - -
0.6351 3920 0.2711 - - - - - - - - - - - - - -
0.6367 3930 0.2523 - - - - - - - - - - - - - -
0.6383 3940 0.2789 - - - - - - - - - - - - - -
0.6400 3950 0.2639 - - - - - - - - - - - - - -
0.6416 3960 0.2609 - - - - - - - - - - - - - -
0.6432 3970 0.2699 - - - - - - - - - - - - - -
0.6448 3980 0.2614 - - - - - - - - - - - - - -
0.6464 3990 0.2567 - - - - - - - - - - - - - -
0.6481 4000 1.2987 - - - - - - - - - - - - - -
0.6497 4010 1.4783 - - - - - - - - - - - - - -
0.6513 4020 1.7162 - - - - - - - - - - - - - -
0.6529 4030 1.2907 - - - - - - - - - - - - - -
0.6545 4040 1.2583 - - - - - - - - - - - - - -
0.6562 4050 1.0498 - - - - - - - - - - - - - -
0.6578 4060 1.8076 - - - - - - - - - - - - - -
0.6594 4070 1.215 - - - - - - - - - - - - - -
0.6610 4080 1.1462 - - - - - - - - - - - - - -
0.6626 4090 0.9511 - - - - - - - - - - - - - -
0.6643 4100 0.6151 - - - - - - - - - - - - - -
0.6659 4110 0.7482 - - - - - - - - - - - - - -
0.6675 4120 0.8572 - - - - - - - - - - - - - -
0.6691 4130 0.7722 - - - - - - - - - - - - - -
0.6707 4140 0.6085 - - - - - - - - - - - - - -
0.6724 4150 0.6644 - - - - - - - - - - - - - -
0.6740 4160 0.6423 - - - - - - - - - - - - - -
0.6756 4170 0.7482 - - - - - - - - - - - - - -
0.6772 4180 0.9649 - - - - - - - - - - - - - -
0.6788 4190 0.9205 - - - - - - - - - - - - - -
0.6805 4200 0.7746 0.2822 0.4484 0.7622 0.2944 0.5133 0.4592 0.2717 0.4451 0.3682 0.2594 0.2342 0.5123 0.4209 0.4055
0.6821 4210 0.5752 - - - - - - - - - - - - - -
0.6837 4220 0.6221 - - - - - - - - - - - - - -
0.6853 4230 0.526 - - - - - - - - - - - - - -
0.6869 4240 0.455 - - - - - - - - - - - - - -
0.6886 4250 0.4964 - - - - - - - - - - - - - -
0.6902 4260 0.935 - - - - - - - - - - - - - -
0.6918 4270 0.6227 - - - - - - - - - - - - - -
0.6934 4280 0.5594 - - - - - - - - - - - - - -
0.6950 4290 0.496 - - - - - - - - - - - - - -
0.6967 4300 0.5907 - - - - - - - - - - - - - -
0.6983 4310 0.5163 - - - - - - - - - - - - - -
0.6999 4320 0.468 - - - - - - - - - - - - - -
0.7015 4330 0.5214 - - - - - - - - - - - - - -
0.7031 4340 0.625 - - - - - - - - - - - - - -
0.7048 4350 0.593 - - - - - - - - - - - - - -
0.7064 4360 0.5852 - - - - - - - - - - - - - -
0.7080 4370 0.5648 - - - - - - - - - - - - - -
0.7096 4380 0.6791 - - - - - - - - - - - - - -
0.7112 4390 0.7008 - - - - - - - - - - - - - -
0.7129 4400 0.6731 - - - - - - - - - - - - - -
0.7145 4410 0.654 - - - - - - - - - - - - - -
0.7161 4420 0.6135 - - - - - - - - - - - - - -
0.7177 4430 0.6206 - - - - - - - - - - - - - -
0.7193 4440 0.5056 - - - - - - - - - - - - - -
0.7210 4450 0.5201 - - - - - - - - - - - - - -
0.7226 4460 0.5894 - - - - - - - - - - - - - -
0.7242 4470 0.5571 - - - - - - - - - - - - - -
0.7258 4480 0.5979 - - - - - - - - - - - - - -
0.7274 4490 0.6202 - - - - - - - - - - - - - -
0.7291 4500 0.5544 - - - - - - - - - - - - - -
0.7307 4510 0.6122 - - - - - - - - - - - - - -
0.7323 4520 0.5631 - - - - - - - - - - - - - -
0.7339 4530 0.5284 - - - - - - - - - - - - - -
0.7355 4540 0.6899 - - - - - - - - - - - - - -
0.7372 4550 0.5838 - - - - - - - - - - - - - -
0.7388 4560 0.6806 - - - - - - - - - - - - - -
0.7404 4570 0.5413 - - - - - - - - - - - - - -
0.7420 4580 0.5956 - - - - - - - - - - - - - -
0.7436 4590 0.6044 - - - - - - - - - - - - - -
0.7453 4600 0.5857 - - - - - - - - - - - - - -
0.7469 4610 0.5664 - - - - - - - - - - - - - -
0.7485 4620 0.5097 - - - - - - - - - - - - - -
0.7501 4630 0.4912 - - - - - - - - - - - - - -
0.7517 4640 0.6049 - - - - - - - - - - - - - -
0.7534 4650 0.5389 - - - - - - - - - - - - - -
0.7550 4660 0.555 - - - - - - - - - - - - - -
0.7566 4670 0.6238 - - - - - - - - - - - - - -
0.7582 4680 0.6447 - - - - - - - - - - - - - -
0.7598 4690 0.5606 - - - - - - - - - - - - - -
0.7615 4700 0.5165 - - - - - - - - - - - - - -
0.7631 4710 0.5839 - - - - - - - - - - - - - -
0.7647 4720 0.5189 - - - - - - - - - - - - - -
0.7663 4730 0.584 - - - - - - - - - - - - - -
0.7679 4740 0.5744 - - - - - - - - - - - - - -
0.7696 4750 0.5351 - - - - - - - - - - - - - -
0.7712 4760 0.5953 - - - - - - - - - - - - - -
0.7728 4770 0.5725 - - - - - - - - - - - - - -
0.7744 4780 0.5688 - - - - - - - - - - - - - -
0.7761 4790 0.5004 - - - - - - - - - - - - - -
0.7777 4800 0.5378 0.3514 0.4652 0.7429 0.3103 0.5406 0.4361 0.2797 0.4267 0.3843 0.2727 0.3474 0.5341 0.4249 0.4243
0.7793 4810 0.5244 - - - - - - - - - - - - - -
0.7809 4820 0.6241 - - - - - - - - - - - - - -
0.7825 4830 0.4844 - - - - - - - - - - - - - -
0.7842 4840 0.4401 - - - - - - - - - - - - - -
0.7858 4850 0.499 - - - - - - - - - - - - - -
0.7874 4860 0.5326 - - - - - - - - - - - - - -
0.7890 4870 0.4981 - - - - - - - - - - - - - -
0.7906 4880 0.5659 - - - - - - - - - - - - - -
0.7923 4890 0.5364 - - - - - - - - - - - - - -
0.7939 4900 0.5479 - - - - - - - - - - - - - -
0.7955 4910 0.4653 - - - - - - - - - - - - - -
0.7971 4920 0.5005 - - - - - - - - - - - - - -
0.7987 4930 0.5624 - - - - - - - - - - - - - -
0.8004 4940 0.4399 - - - - - - - - - - - - - -
0.8020 4950 0.4859 - - - - - - - - - - - - - -
0.8036 4960 0.5087 - - - - - - - - - - - - - -
0.8052 4970 0.511 - - - - - - - - - - - - - -
0.8068 4980 0.5819 - - - - - - - - - - - - - -
0.8085 4990 0.4462 - - - - - - - - - - - - - -
0.8101 5000 0.4882 - - - - - - - - - - - - - -
0.8117 5010 0.5306 - - - - - - - - - - - - - -
0.8133 5020 0.507 - - - - - - - - - - - - - -
0.8149 5030 0.4471 - - - - - - - - - - - - - -
0.8166 5040 0.5333 - - - - - - - - - - - - - -
0.8182 5050 0.4353 - - - - - - - - - - - - - -
0.8198 5060 0.5615 - - - - - - - - - - - - - -
0.8214 5070 0.5629 - - - - - - - - - - - - - -
0.8230 5080 0.5131 - - - - - - - - - - - - - -
0.8247 5090 0.4789 - - - - - - - - - - - - - -
0.8263 5100 0.4934 - - - - - - - - - - - - - -
0.8279 5110 0.5285 - - - - - - - - - - - - - -
0.8295 5120 0.4414 - - - - - - - - - - - - - -
0.8311 5130 0.5262 - - - - - - - - - - - - - -
0.8328 5140 0.4645 - - - - - - - - - - - - - -
0.8344 5150 0.4532 - - - - - - - - - - - - - -
0.8360 5160 0.4421 - - - - - - - - - - - - - -
0.8376 5170 0.4375 - - - - - - - - - - - - - -
0.8392 5180 0.5234 - - - - - - - - - - - - - -
0.8409 5190 0.4803 - - - - - - - - - - - - - -
0.8425 5200 0.4872 - - - - - - - - - - - - - -
0.8441 5210 0.451 - - - - - - - - - - - - - -
0.8457 5220 0.4388 - - - - - - - - - - - - - -
0.8473 5230 0.5182 - - - - - - - - - - - - - -
0.8490 5240 0.5302 - - - - - - - - - - - - - -
0.8506 5250 0.4643 - - - - - - - - - - - - - -
0.8522 5260 0.5581 - - - - - - - - - - - - - -
0.8538 5270 0.4643 - - - - - - - - - - - - - -
0.8554 5280 0.5288 - - - - - - - - - - - - - -
0.8571 5290 0.4133 - - - - - - - - - - - - - -
0.8587 5300 0.4664 - - - - - - - - - - - - - -
0.8603 5310 0.4814 - - - - - - - - - - - - - -
0.8619 5320 0.5256 - - - - - - - - - - - - - -
0.8635 5330 0.4904 - - - - - - - - - - - - - -
0.8652 5340 0.4495 - - - - - - - - - - - - - -
0.8668 5350 0.5389 - - - - - - - - - - - - - -
0.8684 5360 0.4497 - - - - - - - - - - - - - -
0.8700 5370 0.4776 - - - - - - - - - - - - - -
0.8716 5380 0.5441 - - - - - - - - - - - - - -
0.8733 5390 0.4473 - - - - - - - - - - - - - -
0.8749 5400 0.5598 0.3381 0.4668 0.7306 0.3137 0.5415 0.4550 0.2840 0.4169 0.4719 0.2735 0.3582 0.5311 0.4076 0.4299
0.8765 5410 0.4726 - - - - - - - - - - - - - -
0.8781 5420 0.4966 - - - - - - - - - - - - - -
0.8797 5430 0.4644 - - - - - - - - - - - - - -
0.8814 5440 0.4084 - - - - - - - - - - - - - -
0.8830 5450 0.4913 - - - - - - - - - - - - - -
0.8846 5460 0.5708 - - - - - - - - - - - - - -
0.8862 5470 0.5577 - - - - - - - - - - - - - -
0.8878 5480 0.4839 - - - - - - - - - - - - - -
0.8895 5490 0.461 - - - - - - - - - - - - - -
0.8911 5500 0.4799 - - - - - - - - - - - - - -
0.8927 5510 0.5608 - - - - - - - - - - - - - -
0.8943 5520 0.4625 - - - - - - - - - - - - - -
0.8959 5530 0.4765 - - - - - - - - - - - - - -
0.8976 5540 0.4348 - - - - - - - - - - - - - -
0.8992 5550 0.4424 - - - - - - - - - - - - - -
0.9008 5560 0.4147 - - - - - - - - - - - - - -
0.9024 5570 0.433 - - - - - - - - - - - - - -
0.9040 5580 0.4628 - - - - - - - - - - - - - -
0.9057 5590 0.4466 - - - - - - - - - - - - - -
0.9073 5600 0.4563 - - - - - - - - - - - - - -
0.9089 5610 0.4508 - - - - - - - - - - - - - -
0.9105 5620 0.4619 - - - - - - - - - - - - - -
0.9121 5630 0.4264 - - - - - - - - - - - - - -
0.9138 5640 0.5157 - - - - - - - - - - - - - -
0.9154 5650 0.4721 - - - - - - - - - - - - - -
0.9170 5660 0.4518 - - - - - - - - - - - - - -
0.9186 5670 0.4101 - - - - - - - - - - - - - -
0.9202 5680 0.4092 - - - - - - - - - - - - - -
0.9219 5690 0.4042 - - - - - - - - - - - - - -
0.9235 5700 0.3852 - - - - - - - - - - - - - -
0.9251 5710 0.375 - - - - - - - - - - - - - -
0.9267 5720 0.3548 - - - - - - - - - - - - - -
0.9283 5730 0.3461 - - - - - - - - - - - - - -
0.9300 5740 0.3396 - - - - - - - - - - - - - -
0.9316 5750 0.3465 - - - - - - - - - - - - - -
0.9332 5760 0.347 - - - - - - - - - - - - - -
0.9348 5770 0.3365 - - - - - - - - - - - - - -
0.9364 5780 0.3299 - - - - - - - - - - - - - -
0.9381 5790 0.3417 - - - - - - - - - - - - - -
0.9397 5800 0.3423 - - - - - - - - - - - - - -
0.9413 5810 0.3512 - - - - - - - - - - - - - -
0.9429 5820 0.3353 - - - - - - - - - - - - - -
0.9445 5830 0.3291 - - - - - - - - - - - - - -
0.9462 5840 0.3162 - - - - - - - - - - - - - -
0.9478 5850 0.3326 - - - - - - - - - - - - - -
0.9494 5860 0.345 - - - - - - - - - - - - - -
0.9510 5870 0.2998 - - - - - - - - - - - - - -
0.9526 5880 0.307 - - - - - - - - - - - - - -
0.9543 5890 0.3019 - - - - - - - - - - - - - -
0.9559 5900 0.3169 - - - - - - - - - - - - - -
0.9575 5910 0.2857 - - - - - - - - - - - - - -
0.9591 5920 0.3018 - - - - - - - - - - - - - -
0.9607 5930 0.2954 - - - - - - - - - - - - - -
0.9624 5940 0.2953 - - - - - - - - - - - - - -
0.9640 5950 0.2861 - - - - - - - - - - - - - -
0.9656 5960 0.3384 - - - - - - - - - - - - - -
0.9672 5970 0.2968 - - - - - - - - - - - - - -
0.9688 5980 0.3191 - - - - - - - - - - - - - -
0.9705 5990 0.3069 - - - - - - - - - - - - - -
0.9721 6000 0.3025 0.3322 0.4606 0.6623 0.3084 0.5552 0.4463 0.2714 0.4404 0.7084 0.2888 0.3529 0.4924 0.4138 0.4410
0.9737 6010 0.2891 - - - - - - - - - - - - - -
0.9753 6020 0.3038 - - - - - - - - - - - - - -
0.9769 6030 0.2931 - - - - - - - - - - - - - -
0.9786 6040 0.3145 - - - - - - - - - - - - - -
0.9802 6050 0.3046 - - - - - - - - - - - - - -
0.9818 6060 0.2896 - - - - - - - - - - - - - -
0.9834 6070 0.2926 - - - - - - - - - - - - - -
0.9850 6080 0.3025 - - - - - - - - - - - - - -
0.9867 6090 0.2798 - - - - - - - - - - - - - -
0.9883 6100 0.3006 - - - - - - - - - - - - - -
0.9899 6110 0.2695 - - - - - - - - - - - - - -
0.9915 6120 0.3017 - - - - - - - - - - - - - -
0.9931 6130 0.2955 - - - - - - - - - - - - - -
0.9948 6140 0.2699 - - - - - - - - - - - - - -
0.9964 6150 0.2955 - - - - - - - - - - - - - -
0.9980 6160 0.2963 - - - - - - - - - - - - - -
0.9996 6170 0.2988 - - - - - - - - - - - - - -

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.3.0
  • Transformers: 4.46.2
  • PyTorch: 2.1.1+cu121
  • Accelerate: 0.34.2
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
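
The loss cited above trains the model with in-batch negatives: each (query, positive) pair in a batch is scored against every other positive, and the model learns to rank the matching pair highest. As a conceptual sketch only (not the exact training code used for this model), the core computation looks like this; the `scale` of 20.0 follows the Sentence Transformers default, and the tensors here are illustrative:

```python
import torch
import torch.nn.functional as F


def multiple_negatives_ranking_loss(anchors: torch.Tensor,
                                    positives: torch.Tensor,
                                    scale: float = 20.0) -> torch.Tensor:
    """Cross-entropy over cosine similarities with in-batch negatives.

    anchors:   (batch, dim) query embeddings
    positives: (batch, dim) matching passage embeddings
    """
    # Normalize so the dot products below are cosine similarities.
    a = F.normalize(anchors, dim=1)
    p = F.normalize(positives, dim=1)
    # scores[i, j] = cos(anchor_i, positive_j) * scale.
    scores = a @ p.T * scale
    # The true positive for anchor i sits at column i; every other
    # column in the row serves as an in-batch negative.
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```

With perfectly aligned, mutually orthogonal embeddings the loss approaches zero; mismatched pairs drive it up, which is what pushes query and positive embeddings together during fine-tuning.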