SentenceTransformer based on am-azadi/UAE-Large-V1_Fine_Tuned
This is a sentence-transformers model finetuned from am-azadi/UAE-Large-V1_Fine_Tuned. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: am-azadi/UAE-Large-V1_Fine_Tuned
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
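The Pooling layer above uses CLS-token pooling (`pooling_mode_cls_token: True`): the sentence embedding is the transformer's first-token vector rather than a mean over all tokens. A minimal sketch of the idea with toy numbers (the real hidden size is 1024; the values below are hypothetical):

```python
import numpy as np

# Toy token embeddings with shape (seq_len, hidden_dim); the real model
# produces (seq_len, 1024) per sentence. Values here are made up.
token_embeddings = np.array([
    [0.1, 0.2, 0.3],  # [CLS] token
    [0.5, 0.5, 0.5],  # first word piece
    [0.9, 0.1, 0.4],  # second word piece
])

# CLS pooling keeps only the first token's vector as the sentence embedding.
cls_embedding = token_embeddings[0]
print(cls_embedding)  # [0.1 0.2 0.3]
```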
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("am-azadi/UAE-Large-V1_Fine_Tuned_2e")
# Run inference
sentences = [
'Look what a show Pope Francis gave in yesterday\'s homily / sermon! It\'s to be read and reread over and over again... This is the most spiritual Pope since Peter. "You may have flaws, be anxious, and sometimes live irritated, but don\'t forget that your life is the greatest company in the world. Only you can prevent it from going into decline. Many appreciate you, admire you and love you. remember that being happy is not having a sky without storms, a road without accidents, work without fatigue, relationships without disappointments. Being happy is finding strength in forgiveness, hope in battles, security in the stage of fear, love in discord. It\'s not just appreciating the smile, but also reflecting on sadness. It\'s not just celebrating successes, but learning lessons from failures. It\'s not just feeling happy with applause, but being happy in anonymity. Being happy is recognizing that it\'s worth life is worth living, despite all the challenges, misunderstandings, periods of crisis. Being happy is not a fatality of fate, but an achievement for those who manage to travel within themselves. To be happy is to stop feeling like a victim of problems and become the author of his own story . It\'s crossing deserts outside of yourself, but managing to find an oasis in the depths of our soul. It is to thank God for each morning, for the miracle of life. Being happy is not being afraid of your own feelings. It\'s knowing how to talk about yourself. It\'s having the courage to hear a "no". It\'s feeling safe when receiving criticism, even if unfair. It\'s kissing the children, pampering the parents, living poetic moments with friends, even when they hurt us. To be happy is to let the creature that lives in each of us live, free, joyful and simple. It\'s having maturity to be able to say: "I was wrong". It\'s having the courage to say, "I\'m sorry". It\'s having the sensitivity to say: "I need you". It\'s having the ability to say, "I love you". 
May your life become a garden of opportunities to be happy... May your springtime be a lover of joy. May in your winters be a lover of wisdom. And that when you make a mistake, start over from the beginning. For only then will you be in love with life. You will discover that being happy is not having a perfect life. But using tears to irrigate tolerance. Use losses to train patience. Using mistakes to sculpt serenity. Using pain to cut pleasure. Use obstacles to open intelligence windows. Never give up....Never give up on the people who love you. Never give up happiness, for life is an incredible spectacle." (Pope Francis).',
'"The message that Pope Francis gave in yesterday\'s homily/sermon! It is to be read and reread several times... What an admirable man!"',
'Denmark allows Muslim women to wear the niqab',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
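`model.similarity` applies the card's configured similarity function, cosine similarity. A minimal numpy sketch of what that computes, using random stand-in vectors rather than real model embeddings:

```python
import numpy as np

def cos_sim(a, b):
    """Cosine similarity between every row of a and every row of b."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

# Stand-in for model.encode(sentences): 3 embeddings of dimension 1024
embeddings = np.random.RandomState(0).randn(3, 1024)

similarities = cos_sim(embeddings, embeddings)
print(similarities.shape)  # (3, 3)
# The diagonal is 1.0: each embedding is maximally similar to itself.
```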
Training Details
Training Dataset
Unnamed Dataset
- Size: 25,743 training samples
- Columns: sentence_0, sentence_1, and label
- Approximate statistics based on the first 1000 samples:

 | sentence_0 | sentence_1 | label |
---|---|---|---|
type | string | string | float |
min | 6 tokens | 4 tokens | 1.0 |
mean | 113.55 tokens | 17.89 tokens | 1.0 |
max | 512 tokens | 126 tokens | 1.0 |
- Samples:

sentence_0 | sentence_1 | label |
---|---|---|
de. has left the route with his the ultimate nazi prey dogs. With his patches with swastikas of Tito Adolfito Hitler and those little things. But don't tell them fucking Nazis, they'll be offended later 180 * Rov | Santiago Abascal posed next to a man wearing Nazi emblems at a biker rally in Valladolid | 1.0 |
the info is ... a Danish police officer told a woman who was wearing a veil that the parliament had decided to approve the use of the niqab for Muslim women in Denmark. The situation is reversed in Indonesia, which (he said) has a Muslim majority population. Aya naon with the country ieu? | Denmark allows Muslim women to wear the niqab | 1.0 |
In Kolwezi, a Congolese driver destroys Chinese trucks out of anger | A Congolese destroys the trucks of a Chinese company in the DRC | 1.0 |
- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim" }
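MultipleNegativesRankingLoss treats, for each (sentence_0, sentence_1) pair, every other in-batch positive as a negative: it scales the cosine-similarity matrix by `scale` (20.0 here) and applies cross-entropy with the diagonal as the target. A minimal numpy sketch under those assumptions, with toy 8-dimensional vectors in place of real model embeddings:

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Sketch of MultipleNegativesRankingLoss: for anchor i, positive i is
    the target and every other positive in the batch is a negative."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)  # scaled cosine-similarity matrix, (batch, batch)
    # Cross-entropy with the diagonal as the correct class for each row
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.RandomState(0)
anchors = rng.randn(4, 8)
# Identical anchor/positive pairs: each anchor ranks its own pair first,
# so the loss is close to zero.
loss = mnr_loss(anchors, anchors.copy())
print(round(float(loss), 4))
```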
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 2
- per_device_eval_batch_size: 2
- num_train_epochs: 1
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 2
- per_device_eval_batch_size: 2
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
Training Logs
Epoch | Step | Training Loss |
---|---|---|
0.0388 | 500 | 0.0173 |
0.0777 | 1000 | 0.0124 |
0.1165 | 1500 | 0.0127 |
0.1554 | 2000 | 0.0256 |
0.1942 | 2500 | 0.0123 |
0.2331 | 3000 | 0.0199 |
0.2719 | 3500 | 0.0079 |
0.3108 | 4000 | 0.0134 |
0.3496 | 4500 | 0.0127 |
0.3884 | 5000 | 0.026 |
0.4273 | 5500 | 0.0314 |
0.4661 | 6000 | 0.0267 |
0.5050 | 6500 | 0.0145 |
0.5438 | 7000 | 0.0093 |
0.5827 | 7500 | 0.007 |
0.6215 | 8000 | 0.0071 |
0.6603 | 8500 | 0.0116 |
0.6992 | 9000 | 0.0085 |
0.7380 | 9500 | 0.0157 |
0.7769 | 10000 | 0.0051 |
0.8157 | 10500 | 0.0101 |
0.8546 | 11000 | 0.0174 |
0.8934 | 11500 | 0.0116 |
0.9323 | 12000 | 0.0073 |
0.9711 | 12500 | 0.0146 |
Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.3
- PyTorch: 2.5.1+cu124
- Accelerate: 1.3.0
- Datasets: 3.3.1
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
Model tree for am-azadi/UAE-Large-V1_Fine_Tuned_2e
- Base model: WhereIsAI/UAE-Large-V1
- Finetuned from: am-azadi/UAE-Large-V1_Fine_Tuned