
SentenceTransformer based on intfloat/multilingual-e5-small

This is a sentence-transformers model finetuned from intfloat/multilingual-e5-small. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/multilingual-e5-small
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
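
The three modules correspond to transformer encoding, mean pooling over the token embeddings, and L2 normalization of the pooled vector. For environments where the sentence-transformers library is unavailable, the same pipeline can be reproduced with plain transformers. The snippet below is a minimal sketch of that equivalence; the mean_pooling helper is written here for illustration and is not shipped with the model.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    # Average the token embeddings, ignoring padding positions
    token_embeddings = model_output[0]  # last hidden state
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("Tekkla/TripletLoss_flores_kaen")
model = AutoModel.from_pretrained("Tekkla/TripletLoss_flores_kaen")

sentences = ["ბელა, ხომ კარგად ხარ?", "Bella, are you okay?"]
encoded = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    model_output = model(**encoded)

# Mean pooling followed by normalization, mirroring modules (1) and (2) above
embeddings = mean_pooling(model_output, encoded["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)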

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Tekkla/TripletLoss_flores_kaen")
# Run inference
sentences = [
    'ლიგანდების კოორდინაციული ბუნება შესწავლილია ინფრაწითელი სპექტროსკოპიული და რენტგენოგრაფიული მეთოდებით.',
    'The coordination character of cyanate ion has been studied by the methods of infrared spectra and X-ray.',
    'The Applicants argued that declaration of unconstitutionality of a normative act by the Constitutional Court shall be followed by efficient legal consequences.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
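
Because the training data pairs Georgian sentences with their English translations, a natural application is cross-lingual retrieval: embed a Georgian query and rank English candidates by cosine similarity. The sketch below reuses sentences from the training samples shown later in this card; the candidate list itself is assembled purely for illustration.

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Tekkla/TripletLoss_flores_kaen")

query = "1979 წელს ის პირობით გაათავისუფლეს."
candidates = [
    "He was released on licence in 1979.",
    "Bella, are you okay?",
    "The coordination character of cyanate ion has been studied by the methods of infrared spectra and X-ray.",
]

query_embedding = model.encode([query])
candidate_embeddings = model.encode(candidates)

# Cosine similarity scores, shape [1, 3]; the top-scoring candidate should be the translation
scores = model.similarity(query_embedding, candidate_embeddings)
print(candidates[scores.argmax().item()])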

Training Details

Training Dataset

Unnamed Dataset

  • Size: 24,034 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples (all three columns are strings):
    anchor:   min 7 tokens, mean 39.79 tokens, max 170 tokens
    positive: min 8 tokens, mean 32.92 tokens, max 133 tokens
    negative: min 8 tokens, mean 36.72 tokens, max 154 tokens
  • Samples:
    1. anchor:   1979 წელს ის პირობით გაათავისუფლეს.
       positive: He was released on licence in 1979.
       negative: ფსიქოზის გავრცელების ხარისხი აჩვენებს წრფივ კორელაციას ურბანიზაციის ხარისხთან.
    2. anchor:   ვეტერინარულ კონტროლს დაქვემდებარებული საქონლის ექსპორტისას - სერტიფიკატის წარდგენა სავალდებულოა მხოლოდ:
       positive: When exporting the goods subject to veterinary control - it is mandatory to provide a certificate only:
       negative: The Role of Terrestrial Mollusks in Propagation of Trematodes in Urban Environment.
    3. anchor:   ბელა, ხომ კარგად ხარ?
       positive: – Bella, are you okay?
       negative: • to gain feedback on leading questions;
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
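
To recreate this setup, the triples can be loaded as anchor/positive/negative columns and paired with TripletLoss from sentence-transformers. A minimal sketch, using one triple from the samples above; in practice the full 24,034-row dataset would be loaded instead.

from datasets import Dataset
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("intfloat/multilingual-e5-small")

# One illustrative triple in the anchor/positive/negative layout shown above
train_dataset = Dataset.from_dict({
    "anchor": ["ბელა, ხომ კარგად ხარ?"],
    "positive": ["Bella, are you okay?"],
    "negative": ["to gain feedback on leading questions;"],
})

train_loss = losses.TripletLoss(
    model=model,
    distance_metric=losses.TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)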
    

Evaluation Dataset

Unnamed Dataset

  • Size: 3,005 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples (all three columns are strings):
    anchor:   min 8 tokens, mean 38.7 tokens,  max 138 tokens
    positive: min 8 tokens, mean 31.89 tokens, max 96 tokens
    negative: min 8 tokens, mean 36.32 tokens, max 95 tokens
  • Samples:
    1. anchor:   3. თუ გადასახადის გადამხდელი იღებს ან მას უფლება აქვს, მიიღოს შემოსავალი პროცენტის სახით ან ქონების იჯარით გადაცემით, შემოსავალი სავალო ვალდებულების ან იჯარის ხელშეკრულების ვადის გასვლის მომენტში მიღებულად ითვლება.
       positive: 3. If a taxpayer earns or has the right to earn income in the form of interest or from leasing property, the income shall be deemed to have been obtained at the moment when the debt obligation or lease agreement expires.
       negative: In, Cd და Bi დაცილება ანიონიტ AB–17-ის OH′-ფორმაზე დალექვითი ქრომატოგრაფიის მეთოდით.
    2. anchor:   პროფესიონალიზმის მაღალი ხარისხი ნიშნავს, რომ ჟურნალისტიკა, როგორც ინსტიტუტი, დიფერენცირებულია და სხვა ინსტიტუტებისგან განსხვავებული პრაქტიკა აქვს, მათ შორის, პოლიტიკის ჩათვლით.
       positive: A high degree of professionalization of journalism means that journalism is differentiated as an institution and form of practice from other institutions and forms of practice – including politics.
       negative: ჯანმრთელობის დაცვა და სოციალური დახმარება, კომუნალური, სოციალური და პერსონალური მომსახურების გაწევა.
    3. anchor:   ამგვარად, მსგავს შემთხვევებში შეიძლება საჭირო იყოს დამატებითი ფრაზები, რათა თავიდან იქნეს აცილებული ისე წარმოჩენა, თითქოს მარწმუნებელ ანგარიშში ნაგულისხმევია, რომ პრაქტიკოსის პასუხისმგებლობა გამოთქმულ დასკვნაზე შემცირებულია ექსპერტის ჩართულობის გამო.
       positive: Therefore, additional wording may be needed in such cases to prevent the assurance report implying that the practitioner’s responsibility for the conclusion expressed is reduced because of the involvement of the expert.
       negative: სმენის პროთეზირება მრგვალი სარკმლის ეკრანირებისათვის ფოროვანი ელასტომერის და მეტალის ფირფიტის გამოყენებით.
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
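
The evaluation split can additionally be scored with a TripletEvaluator, which reports the fraction of triples where the anchor lands closer to the positive than to the negative. A sketch, assuming eval_dataset is a datasets.Dataset with the three columns described above and model is the model being trained; the evaluator name is a placeholder.

from sentence_transformers.evaluation import TripletEvaluator

dev_evaluator = TripletEvaluator(
    anchors=eval_dataset["anchor"],
    positives=eval_dataset["positive"],
    negatives=eval_dataset["negative"],
    name="flores-kaen-dev",  # hypothetical run name
)
dev_evaluator(model)  # returns accuracy-style metrics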
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 2
  • learning_rate: 0.0001
  • num_train_epochs: 10
  • warmup_steps: 1000
  • batch_sampler: no_duplicates
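
These values map directly onto SentenceTransformerTrainingArguments in sentence-transformers 3.x. A sketch of the equivalent configuration follows; output_dir is a placeholder, and everything not listed above keeps its default.

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="models/TripletLoss_flores_kaen",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    learning_rate=1e-4,
    num_train_epochs=10,
    warmup_steps=1000,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)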

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • learning_rate: 0.0001
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 1000
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
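
Putting the pieces together, SentenceTransformerTrainer ties the model, arguments, datasets, loss, and evaluator from the sketches above into a single run. This is again a sketch under those assumptions, not the author's original training script.

from sentence_transformers import SentenceTransformerTrainer

trainer = SentenceTransformerTrainer(
    model=model,                  # base model loaded earlier
    args=args,                    # training arguments sketched above
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=train_loss,
    evaluator=dev_evaluator,
)
trainer.train()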

Training Logs

Epoch    Step    Training Loss    Validation Loss
0.0133 10 4.7952 -
0.0266 20 4.7856 -
0.0399 30 4.7634 -
0.0532 40 4.7186 -
0.0665 50 4.6771 -
0.0798 60 4.6085 -
0.0931 70 4.4944 -
0.1065 80 4.3714 -
0.1198 90 4.2601 -
0.1331 100 4.2006 4.1392
0.1464 110 4.1937 -
0.1597 120 4.1503 -
0.1730 130 4.1355 -
0.1863 140 4.1164 -
0.1996 150 4.0822 -
0.2129 160 4.0613 -
0.2262 170 4.0549 -
0.2395 180 4.0938 -
0.2528 190 3.9957 -
0.2661 200 4.0573 3.9721
0.2794 210 4.0657 -
0.2927 220 4.0191 -
0.3061 230 4.0222 -
0.3194 240 4.0265 -
0.3327 250 4.0407 -
0.3460 260 3.997 -
0.3593 270 3.9782 -
0.3726 280 3.9818 -
0.3859 290 3.9965 -
0.3992 300 3.989 3.9337
0.4125 310 3.9439 -
0.4258 320 4.0057 -
0.4391 330 3.9681 -
0.4524 340 3.9903 -
0.4657 350 3.9816 -
0.4790 360 3.9776 -
0.4923 370 3.9555 -
0.5057 380 3.9927 -
0.5190 390 3.9753 -
0.5323 400 3.9917 3.9099
0.5456 410 3.9693 -
0.5589 420 3.9546 -
0.5722 430 3.9701 -
0.5855 440 3.9558 -
0.5988 450 3.9677 -
0.6121 460 3.953 -
0.6254 470 3.9279 -
0.6387 480 3.982 -
0.6520 490 3.9113 -
0.6653 500 3.9419 3.8756
0.6786 510 3.8882 -
0.6919 520 3.9268 -
0.7053 530 3.9446 -
0.7186 540 3.8975 -
0.7319 550 3.939 -
0.7452 560 3.9551 -
0.7585 570 3.931 -
0.7718 580 3.9403 -
0.7851 590 3.9375 -
0.7984 600 3.9305 3.8727
0.8117 610 3.9354 -
0.8250 620 3.9104 -
0.8383 630 3.9487 -
0.8516 640 3.9716 -
0.8649 650 3.9227 -
0.8782 660 3.9487 -
0.8916 670 3.9278 -
0.9049 680 3.9275 -
0.9182 690 3.9496 -
0.9315 700 3.9178 3.8614
0.9448 710 3.9015 -
0.9581 720 3.984 -
0.9714 730 3.917 -
0.9847 740 3.9371 -
0.9980 750 3.9106 -
1.0113 760 3.892 -
1.0246 770 3.8854 -
1.0379 780 3.9142 -
1.0512 790 3.9096 -
1.0645 800 3.9099 3.8635
1.0778 810 3.9599 -
1.0912 820 3.9025 -
1.1045 830 3.888 -
1.1178 840 3.8837 -
1.1311 850 3.9253 -
1.1444 860 3.9419 -
1.1577 870 3.8841 -
1.1710 880 3.9644 -
1.1843 890 3.9211 -
1.1976 900 3.9088 3.8651
1.2109 910 3.9024 -
1.2242 920 3.9129 -
1.2375 930 4.0027 -
1.2508 940 3.9038 -
1.2641 950 3.8736 -
1.2774 960 3.9454 -
1.2908 970 3.9104 -
1.3041 980 3.9552 -
1.3174 990 3.9194 -
1.3307 1000 3.9635 3.8888
1.3440 1010 3.8538 -
1.3573 1020 3.8927 -
1.3706 1030 3.8978 -
1.3839 1040 3.9293 -
1.3972 1050 3.8962 -
1.4105 1060 3.8857 -
1.4238 1070 3.9146 -
1.4371 1080 3.8997 -
1.4504 1090 3.9347 -
1.4637 1100 3.9239 3.8753
1.4770 1110 3.9165 -
1.4904 1120 3.8733 -
1.5037 1130 3.8981 -
1.5170 1140 3.8948 -
1.5303 1150 3.9131 -
1.5436 1160 3.8931 -
1.5569 1170 3.9122 -
1.5702 1180 3.8837 -
1.5835 1190 3.8917 -
1.5968 1200 3.9078 3.9019
1.6101 1210 3.9066 -
1.6234 1220 3.911 -
1.6367 1230 3.9278 -
1.6500 1240 3.8323 -
1.6633 1250 3.8966 -
1.6766 1260 3.9212 -
1.6900 1270 3.8609 -
1.7033 1280 3.8928 -
1.7166 1290 3.8495 -
1.7299 1300 3.8748 3.8766
1.7432 1310 3.9214 -
1.7565 1320 3.8944 -
1.7698 1330 3.9011 -
1.7831 1340 3.8986 -
1.7964 1350 3.8911 -
1.8097 1360 3.8789 -
1.8230 1370 3.8749 -
1.8363 1380 3.8835 -
1.8496 1390 3.9067 -
1.8629 1400 3.9141 3.8553
1.8762 1410 3.9095 -
1.8896 1420 3.8742 -
1.9029 1430 3.8965 -
1.9162 1440 3.91 -
1.9295 1450 3.8745 -
1.9428 1460 3.8642 -
1.9561 1470 3.9136 -
1.9694 1480 3.8681 -
1.9827 1490 3.8942 -
1.9960 1500 3.8332 3.8629
2.0093 1510 3.8361 -
2.0226 1520 3.872 -
2.0359 1530 3.8742 -
2.0492 1540 3.8621 -
2.0625 1550 3.8804 -
2.0758 1560 3.8928 -
2.0892 1570 3.8203 -
2.1025 1580 3.7907 -
2.1158 1590 3.85 -
2.1291 1600 3.823 3.8559
2.1424 1610 3.8706 -
2.1557 1620 3.8681 -
2.1690 1630 3.8459 -
2.1823 1640 3.8592 -
2.1956 1650 3.8635 -
2.2089 1660 3.8668 -
2.2222 1670 3.8677 -
2.2355 1680 3.8798 -
2.2488 1690 3.8385 -
2.2621 1700 3.8293 3.8560
2.2754 1710 3.8508 -
2.2888 1720 3.8703 -
2.3021 1730 3.8749 -
2.3154 1740 3.8837 -
2.3287 1750 3.8855 -
2.3420 1760 3.8291 -
2.3553 1770 3.8449 -
2.3686 1780 3.8325 -
2.3819 1790 3.8719 -
2.3952 1800 3.8141 3.8731
2.4085 1810 3.8325 -
2.4218 1820 3.8812 -
2.4351 1830 3.8565 -
2.4484 1840 3.8644 -
2.4617 1850 3.8812 -
2.4750 1860 3.869 -
2.4884 1870 3.8284 -
2.5017 1880 3.8615 -
2.5150 1890 3.8223 -
2.5283 1900 3.8676 3.8441
2.5416 1910 3.8528 -
2.5549 1920 3.8715 -
2.5682 1930 3.856 -
2.5815 1940 3.8192 -
2.5948 1950 3.8814 -
2.6081 1960 3.8194 -
2.6214 1970 3.8343 -
2.6347 1980 3.846 -
2.6480 1990 3.8926 -
2.6613 2000 3.8404 3.8484
2.6747 2010 3.816 -
2.6880 2020 3.8457 -
2.7013 2030 3.8496 -
2.7146 2040 3.8099 -
2.7279 2050 3.8689 -
2.7412 2060 3.849 -
2.7545 2070 3.8404 -
2.7678 2080 3.8555 -
2.7811 2090 3.878 -
2.7944 2100 3.8175 3.8656
2.8077 2110 3.8551 -
2.8210 2120 3.8031 -
2.8343 2130 3.8679 -
2.8476 2140 3.8591 -
2.8609 2150 3.8395 -
2.8743 2160 3.8368 -
2.8876 2170 3.8351 -
2.9009 2180 3.8646 -
2.9142 2190 3.8841 -
2.9275 2200 3.8473 3.8684
2.9408 2210 3.8345 -
2.9541 2220 3.845 -
2.9674 2230 3.8374 -
2.9807 2240 3.8252 -
2.9940 2250 3.7778 -
3.0073 2260 3.7963 -
3.0206 2270 3.8533 -
3.0339 2280 3.8338 -
3.0472 2290 3.8037 -
3.0605 2300 3.789 3.8640
3.0739 2310 3.8344 -
3.0872 2320 3.8114 -
3.1005 2330 3.7935 -
3.1138 2340 3.7721 -
3.1271 2350 3.8016 -
3.1404 2360 3.8206 -
3.1537 2370 3.8103 -
3.1670 2380 3.8053 -
3.1803 2390 3.8356 -
3.1936 2400 3.8245 3.8609
3.2069 2410 3.8099 -
3.2202 2420 3.8413 -
3.2335 2430 3.8133 -
3.2468 2440 3.8218 -
3.2601 2450 3.8258 -
3.2735 2460 3.7975 -
3.2868 2470 3.8513 -
3.3001 2480 3.7996 -
3.3134 2490 3.8503 -
3.3267 2500 3.7947 3.8511
3.3400 2510 3.7984 -
3.3533 2520 3.8075 -
3.3666 2530 3.8049 -
3.3799 2540 3.8186 -
3.3932 2550 3.7944 -
3.4065 2560 3.8104 -
3.4198 2570 3.817 -
3.4331 2580 3.8052 -
3.4464 2590 3.8233 -
3.4597 2600 3.8671 3.8738
3.4731 2610 3.824 -
3.4864 2620 3.8215 -
3.4997 2630 3.8113 -
3.5130 2640 3.7831 -
3.5263 2650 3.8616 -
3.5396 2660 3.8325 -
3.5529 2670 3.8189 -
3.5662 2680 3.865 -
3.5795 2690 3.7572 -
3.5928 2700 3.8308 3.8531
3.6061 2710 3.7959 -
3.6194 2720 3.8129 -
3.6327 2730 3.8402 -
3.6460 2740 3.8114 -
3.6593 2750 3.7955 -
3.6727 2760 3.8054 -
3.6860 2770 3.7986 -
3.6993 2780 3.7911 -
3.7126 2790 3.8203 -
3.7259 2800 3.7763 3.8455
3.7392 2810 3.8178 -
3.7525 2820 3.8654 -
3.7658 2830 3.8132 -
3.7791 2840 3.8255 -
3.7924 2850 3.7809 -
3.8057 2860 3.8175 -
3.8190 2870 3.7677 -
3.8323 2880 3.8271 -
3.8456 2890 3.8145 -
3.8589 2900 3.8025 3.8522
3.8723 2910 3.787 -
3.8856 2920 3.8068 -
3.8989 2930 3.8305 -
3.9122 2940 3.849 -
3.9255 2950 3.7765 -
3.9388 2960 3.8451 -
3.9521 2970 3.8468 -
3.9654 2980 3.8188 -
3.9787 2990 3.7912 -
3.9920 3000 3.7558 3.8499
4.0053 3010 3.7498 -
4.0186 3020 3.8196 -
4.0319 3030 3.8121 -
4.0452 3040 3.7971 -
4.0585 3050 3.7756 -
4.0719 3060 3.7782 -
4.0852 3070 3.7915 -
4.0985 3080 3.782 -
4.1118 3090 3.7506 -
4.1251 3100 3.782 3.8648
4.1384 3110 3.7541 -
4.1517 3120 3.8093 -
4.1650 3130 3.7708 -
4.1783 3140 3.8064 -
4.1916 3150 3.7941 -
4.2049 3160 3.7623 -
4.2182 3170 3.8032 -
4.2315 3180 3.7828 -
4.2448 3190 3.8005 -
4.2582 3200 3.7736 3.8566
4.2715 3210 3.7538 -
4.2848 3220 3.8005 -
4.2981 3230 3.7946 -
4.3114 3240 3.8061 -
4.3247 3250 3.7911 -
4.3380 3260 3.7947 -
4.3513 3270 3.7622 -
4.3646 3280 3.7866 -
4.3779 3290 3.7812 -
4.3912 3300 3.7575 3.8530
4.4045 3310 3.7578 -
4.4178 3320 3.7521 -
4.4311 3330 3.7863 -
4.4444 3340 3.7835 -
4.4578 3350 3.8357 -
4.4711 3360 3.796 -
4.4844 3370 3.7951 -
4.4977 3380 3.7668 -
4.5110 3390 3.7735 -
4.5243 3400 3.7996 3.8634
4.5376 3410 3.7848 -
4.5509 3420 3.7763 -
4.5642 3430 3.7953 -
4.5775 3440 3.7485 -
4.5908 3450 3.793 -
4.6041 3460 3.7641 -
4.6174 3470 3.7535 -
4.6307 3480 3.7975 -
4.6440 3490 3.81 -
4.6574 3500 3.7288 3.8684
4.6707 3510 3.8165 -
4.6840 3520 3.7747 -
4.6973 3530 3.7402 -
4.7106 3540 3.7528 -
4.7239 3550 3.7532 -
4.7372 3560 3.7766 -
4.7505 3570 3.8459 -
4.7638 3580 3.785 -
4.7771 3590 3.8026 -
4.7904 3600 3.7801 3.8470
4.8037 3610 3.7737 -
4.8170 3620 3.7665 -
4.8303 3630 3.8046 -
4.8436 3640 3.757 -
4.8570 3650 3.7978 -
4.8703 3660 3.779 -
4.8836 3670 3.7528 3.8492

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.42.4
  • PyTorch: 2.3.1+cu121
  • Accelerate: 0.32.1
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification}, 
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}