
Visualize in Weights & Biases

nllb-200-tiny-tuned

This model is a fine-tuned version of igorktech/nllb-pruned-6L-512d-finetuned-v1 on the your_dataset_name dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0980
  • Bleu: 52.9983
  • Chrf++: 73.2746

Model description

More information needed

Intended uses & limitations

More information needed
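
The checkpoint can be loaded like any other NLLB seq2seq model. The sketch below is a minimal, hedged example: the repo id, the source/target language codes, and the input text are placeholders, since the card does not document the actual translation direction of the fine-tuning data.

```python
# Minimal inference sketch. Assumptions: the repo id and the NLLB language codes
# ("eng_Latn" -> "fra_Latn") are placeholders; the real translation direction
# depends on the undocumented fine-tuning dataset.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "igorktech/nllb-200-tiny-tuned"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
generated = model.generate(
    **inputs,
    # NLLB expects the target-language token as the forced BOS token.
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=64,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```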

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2.0
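
Assuming the model was trained with the 🤗 Seq2SeqTrainer (the card does not say so explicitly), these settings map roughly onto Seq2SeqTrainingArguments as sketched below; output_dir and predict_with_generate are assumptions, not values stated in the card.

```python
# Hedged configuration sketch: maps the listed hyperparameters onto
# Seq2SeqTrainingArguments. output_dir and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nllb-200-tiny-tuned",   # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
    adam_beta1=0.9,                     # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,         # needed to report BLEU/chrF++ at eval time
)
```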

Training results

| Training Loss | Epoch  | Step  | Validation Loss | Bleu    | Chrf++  |
|:-------------:|:------:|:-----:|:---------------:|:-------:|:-------:|
| 0.2285        | 0.6563 | 5000  | 0.1744          | 37.0851 | 62.3627 |
| 0.1872        | 1.3125 | 10000 | 0.1214          | 47.5689 | 69.8186 |
| 0.1089        | 1.9688 | 15000 | 0.0980          | 52.9983 | 73.2746 |
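
The card does not state how BLEU and chrF++ were computed; a common setup uses the `evaluate` library's sacrebleu and chrf metrics (chrF++ is chrf with word_order=2), as sketched below.

```python
# Hedged metric sketch: BLEU via sacrebleu and chrF++ via chrf(word_order=2),
# as typically done with the `evaluate` library. The exact metric configuration
# used for this model is not documented in the card.
import evaluate

bleu = evaluate.load("sacrebleu")
chrf = evaluate.load("chrf")

def score(predictions, references):
    # predictions: list[str]; references: list[str], one reference per prediction
    refs = [[r] for r in references]
    return {
        "bleu": bleu.compute(predictions=predictions, references=refs)["score"],
        "chrf++": chrf.compute(predictions=predictions, references=refs, word_order=2)["score"],
    }

print(score(["the cat sat on the mat"], ["the cat sat on the mat"]))
```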

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1