---
base_model: Bofandra/fine-tuning-use-cmlm-multilingual-quran
datasets: []
language: []
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:6225
  - loss:MegaBatchMarginLoss
widget:
  - source_sentence: >-
      يا أيها الذين آمنوا لا تتخذوا الكافرين أولياء من دون المؤمنين أتريدون أن
      تجعلوا لله عليكم سلطانا مبينا
    sentences:
      - >-
        And when he attained his full strength and was [mentally] mature, We
        bestowed upon him judgement and knowledge. And thus do We reward the
        doers of good.
      - Then Moses threw his staff, and at once it devoured what they falsified.
      - >-
        O you who have believed, do not take the disbelievers as allies instead
        of the believers. Do you wish to give Allah against yourselves a clear
        case?
  - source_sentence: قال لم أكن لأسجد لبشر خلقته من صلصال من حمإ مسنون
    sentences:
      - And We left it as a sign, so is there any who will remember?
      - Gardens of perpetual residence, whose doors will be opened to them.
      - >-
        He said, "Never would I prostrate to a human whom You created out of
        clay from an altered black mud."
  - source_sentence: وسخر لكم الشمس والقمر دائبين وسخر لكم الليل والنهار
    sentences:
      - >-
        And He subjected for you the sun and the moon, continuous [in orbit],
        and subjected for you the night and the day.
      - >-
        And We called him from the side of the mount at [his] right and brought
        him near, confiding [to him].
      - >-
        And We send not the messengers except as bringers of good tidings and
        warners. And those who disbelieve dispute by [using] falsehood to
        [attempt to] invalidate thereby the truth and have taken My verses, and
        that of which they are warned, in ridicule.
  - source_sentence: إذ دخلوا عليه فقالوا سلاما قال إنا منكم وجلون
    sentences:
      - >-
        Indeed, your Lord is most knowing of who strays from His way, and He is
        most knowing of the [rightly] guided.
      - >-
        Then they turned away from him and said, "[He was] taught [and is] a
        madman."
      - >-
        When they entered upon him and said, "Peace." [Abraham] said, "Indeed,
        we are fearful of you."
  - source_sentence: فأما من أوتي كتابه بيمينه فيقول هاؤم اقرءوا كتابيه
    sentences:
      - >-
        So as for he who is given his record in his right hand, he will say,
        "Here, read my record!
      - >-
        And whoever is patient and forgives - indeed, that is of the matters
        [requiring] determination.
      - Indeed, he had [once] been among his people in happiness;
---

SentenceTransformer based on Bofandra/fine-tuning-use-cmlm-multilingual-quran

This is a sentence-transformers model fine-tuned from Bofandra/fine-tuning-use-cmlm-multilingual-quran. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Bofandra/fine-tuning-use-cmlm-multilingual-quran
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
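
Since the pipeline ends in a Normalize() module, every embedding comes out unit-length, so cosine similarity and dot product give the same rankings. A minimal sketch to confirm the dimensions stated above:

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Bofandra/fine-tuning-use-cmlm-multilingual-quran-translation")

print(model.max_seq_length)                      # 256, from the Transformer module
print(model.get_sentence_embedding_dimension())  # 768, from the Pooling module

# Normalize() scales each embedding to unit length
embedding = model.encode(["a test sentence"])[0]
print(np.linalg.norm(embedding))  # ~1.0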

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Bofandra/fine-tuning-use-cmlm-multilingual-quran-translation")
# Run inference
sentences = [
    'فأما من أوتي كتابه بيمينه فيقول هاؤم اقرءوا كتابيه',
    'So as for he who is given his record in his right hand, he will say, "Here, read my record!',
    'Indeed, he had [once] been among his people in happiness;',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
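
Because Arabic verses and their English translations land in the same vector space, the model can also be used for cross-lingual semantic search. The sketch below reuses the sentences from the example above; the three-sentence corpus is only illustrative:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Bofandra/fine-tuning-use-cmlm-multilingual-quran-translation")

# Candidate pool: English translations
corpus = [
    'So as for he who is given his record in his right hand, he will say, "Here, read my record!',
    'And whoever is patient and forgives - indeed, that is of the matters [requiring] determination.',
    'Indeed, he had [once] been among his people in happiness;',
]
# Arabic query verse
query = 'فأما من أوتي كتابه بيمينه فيقول هاؤم اقرءوا كتابيه'

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Embeddings are normalized, so cosine similarity is just a dot product
scores = model.similarity(query_embedding, corpus_embeddings)  # shape: (1, 3)
best = scores.argmax().item()
print(corpus[best])  # expected: the first sentence, the matching translation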

Training Details

Training Dataset

Unnamed Dataset

  • Size: 6,225 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0 (string): min: 3 tokens, mean: 23.11 tokens, max: 163 tokens
    • sentence_1 (string): min: 6 tokens, mean: 34.63 tokens, max: 180 tokens
  • Samples:
    • sentence_0: ومن آياته أنك ترى الأرض خاشعة فإذا أنزلنا عليها الماء اهتزت وربت إن الذي أحياها لمحيي الموتى إنه على كل شيء قدير
      sentence_1: And of His signs is that you see the earth stilled, but when We send down upon it rain, it quivers and grows. Indeed, He who has given it life is the Giver of Life to the dead. Indeed, He is over all things competent.
    • sentence_0: من دون الله قالوا ضلوا عنا بل لم نكن ندعو من قبل شيئا كذلك يضل الله الكافرين
      sentence_1: Other than Allah?" They will say, "They have departed from us; rather, we did not used to invoke previously anything." Thus does Allah put astray the disbelievers.
    • sentence_0: أرأيت الذي ينهى
      sentence_1: Have you seen the one who forbids
  • Loss: MegaBatchMarginLoss
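
MegaBatchMarginLoss consumes (anchor, positive) pairs, here the (sentence_0, sentence_1) columns: within each large batch it selects, for every anchor, the hardest non-matching sentence as a negative and applies a margin loss. A minimal sketch of how the loss is constructed, assuming the base model as the starting point:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MegaBatchMarginLoss

model = SentenceTransformer("Bofandra/fine-tuning-use-cmlm-multilingual-quran")

# Each (sentence_0, sentence_1) pair is a positive; the hardest other
# in-batch sentence_1 serves as the mined negative for the anchor.
loss = MegaBatchMarginLoss(model)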

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • num_train_epochs: 1
  • multi_dataset_batch_sampler: round_robin
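
Put together, a training run with these non-default values might look like the following sketch (output_dir and the one-row dataset are placeholders; the hyperparameters mirror the list above):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MegaBatchMarginLoss

model = SentenceTransformer("Bofandra/fine-tuning-use-cmlm-multilingual-quran")

# Columns match the training data: sentence_0 (Arabic), sentence_1 (English)
train_dataset = Dataset.from_dict({
    "sentence_0": ["أرأيت الذي ينهى"],
    "sentence_1": ["Have you seen the one who forbids"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="models/quran-translation",  # hypothetical output path
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=1,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MegaBatchMarginLoss(model),
)
trainer.train()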

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch   Step   Training Loss
0.3211   500   0.2393
0.6423  1000   0.1212
0.9634  1500   0.0715

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.3.0+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MegaBatchMarginLoss

@inproceedings{wieting-gimpel-2018-paranmt,
    title = "{P}ara{NMT}-50{M}: Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations",
    author = "Wieting, John and Gimpel, Kevin",
    editor = "Gurevych, Iryna and Miyao, Yusuke",
    booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2018",
    address = "Melbourne, Australia",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/P18-1042",
    doi = "10.18653/v1/P18-1042",
    pages = "451--462",
}