---
base_model: roberta-base
license: mit
metrics:
  - f1
tags:
  - generated_from_trainer
model-index:
  - name: Monglish_chatbot
    results: []
---

# Monglish_chatbot

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4413
- F1: 0.9286

## Model description

More information needed

## Intended uses & limitations

More information needed
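As a starting point, the checkpoint can be loaded as a sequence classifier. This is a minimal sketch, not the author's documented usage: the repo id `Ahmedhany216/Monglish_chatbot` is inferred from the card header, the example query is invented, and the label set is unknown (it lives in the checkpoint's `id2label` config).

```python
def predict_label(logits, id2label):
    """Return the label whose logit is largest (argmax over a plain list)."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return id2label[best]

def classify(text, model_id="Ahmedhany216/Monglish_chatbot"):
    """Run the fine-tuned classifier on one input string.
    NOTE: the default repo id is an assumption inferred from this card's
    header; adjust it to the actual Hub path if it differs."""
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return predict_label(logits, model.config.id2label)

# Example (requires network access to the Hub; the query is hypothetical):
# classify("How do I enroll in a course?")
```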

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25

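A small sketch of what the `linear` scheduler implies for this run: the learning rate decays from the 2e-05 base to zero over the 350 optimizer steps visible in the results table (14 steps per epoch × 25 epochs). Zero warmup is an assumption, since the card lists no warmup setting.

```python
def linear_lr(step, base_lr=2e-5, total_steps=350, warmup_steps=0):
    """Linear schedule: optional linear warmup to base_lr, then linear
    decay to zero by total_steps. warmup_steps=0 is assumed here because
    the card does not list a warmup hyperparameter."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

# Midway through training (step 175) the rate has halved to 1e-05,
# and it reaches zero at the final step 350.
```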
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.6862        | 1.0   | 14   | 2.7000          | 0.0    |
| 2.7247        | 2.0   | 28   | 2.6786          | 0.0    |
| 2.7909        | 3.0   | 42   | 2.6144          | 0.3571 |
| 2.7791        | 4.0   | 56   | 2.3593          | 0.4286 |
| 2.0472        | 5.0   | 70   | 1.9184          | 0.5714 |
| 1.9152        | 6.0   | 84   | 1.6711          | 0.6429 |
| 1.096         | 7.0   | 98   | 1.4660          | 0.7143 |
| 0.9427        | 8.0   | 112  | 1.2447          | 0.7143 |
| 1.019         | 9.0   | 126  | 1.1048          | 0.7857 |
| 0.7788        | 10.0  | 140  | 0.9651          | 0.8571 |
| 0.6109        | 11.0  | 154  | 0.8429          | 0.9286 |
| 0.6004        | 12.0  | 168  | 0.7675          | 0.8571 |
| 0.3126        | 13.0  | 182  | 0.6817          | 0.9286 |
| 0.2096        | 14.0  | 196  | 0.6414          | 0.9286 |
| 0.3601        | 15.0  | 210  | 0.6030          | 0.9286 |
| 0.3269        | 16.0  | 224  | 0.5345          | 0.9286 |
| 0.2457        | 17.0  | 238  | 0.5132          | 0.9286 |
| 0.1686        | 18.0  | 252  | 0.4870          | 0.9286 |
| 0.0845        | 19.0  | 266  | 0.4625          | 0.9286 |
| 0.1983        | 20.0  | 280  | 0.4521          | 0.9286 |
| 0.0878        | 21.0  | 294  | 0.4449          | 0.9286 |
| 0.1363        | 22.0  | 308  | 0.4533          | 0.9286 |
| 0.1675        | 23.0  | 322  | 0.4502          | 0.9286 |
| 0.1459        | 24.0  | 336  | 0.4437          | 0.9286 |
| 0.1118        | 25.0  | 350  | 0.4413          | 0.9286 |

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1