TiRoBERTa-GeezSwitch

This model is a fine-tuned version of fgaim/tiroberta-base on the GeezSwitch dataset.

It achieves the following results on the test set:

  • F1: 0.9948
  • Recall: 0.9948
  • Precision: 0.9948
  • Accuracy: 0.9948
  • Loss: 0.0222
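
To try the model for language identification, here is a minimal usage sketch with the Hugging Face pipeline API (the model id matches this card; the example sentence is an arbitrary Tigrinya greeting and the printed output is illustrative):

```python
from transformers import pipeline

# Load the fine-tuned sequence-classification model from the Hub.
classifier = pipeline("text-classification", model="fgaim/tiroberta-geezswitch")

# Classify a short Ge'ez-script sentence (illustrative Tigrinya input).
result = classifier("ሰላም፣ ከመይ ኣለኻ፧")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```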

Training

Hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Trainer setup follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3.0
  • seed: 42
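
As a rough reconstruction (not the authors' actual training script), the hyperparameters above map onto the Hugging Face Trainer API as sketched below. The dataset variables are placeholders for tokenized GeezSwitch splits, and num_labels=5 assumes one class per language in the GeezSwitch dataset:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("fgaim/tiroberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "fgaim/tiroberta-base",
    num_labels=5,  # assumption: one class per GeezSwitch language
)

args = TrainingArguments(
    output_dir="tiroberta-geezswitch",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    seed=42,
    # Adam betas/epsilon match the reported settings (also the defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: tokenized GeezSwitch train split
    eval_dataset=eval_dataset,    # placeholder: tokenized GeezSwitch test split
    tokenizer=tokenizer,
)
trainer.train()
```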

Framework versions

  • Transformers 4.19.0.dev0
  • PyTorch 1.11.0+cu113
  • Datasets 2.1.0
  • Tokenizers 0.12.1

Citation

If you use this model or the GeezSwitch dataset in your research, please cite as follows:

@inproceedings{fgaim2022geezswitch,
  title={GeezSwitch: Language Identification in Typologically Related Low-resourced East African Languages},
  author={Fitsum Gaim and Wonsuk Yang and Jong C. Park},
  booktitle={Proceedings of the 13th Language Resources and Evaluation Conference},
  year={2022}
}