This roberta-base
model was fine-tuned for sequence classification using TextAttack
and the rotten_tomatoes dataset loaded using the nlp
library. The model was fine-tuned
for 10 epochs with a batch size of 64, a learning
rate of 2e-05, and a maximum sequence length of 128.
Since this was a classification task, the model was trained with a cross-entropy loss function.
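The original TextAttack training script is not reproduced here, but the same configuration can be sketched with the Hugging Face Trainer. The snippet below is a minimal, illustrative reproduction rather than the exact code used: it assumes the datasets library (the renamed nlp library) and uses an arbitrary output directory.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# rotten_tomatoes is a binary sentiment dataset (0 = negative, 1 = positive);
# the `nlp` library mentioned above has since been renamed to `datasets`.
dataset = load_dataset("rotten_tomatoes")

tokenizer = AutoTokenizer.from_pretrained("roberta-base")

def tokenize(batch):
    # Pad/truncate to the maximum sequence length of 128 used for fine-tuning.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

# For sequence classification, the default training objective is cross-entropy loss.
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# Hyperparameters from the description above; output_dir is an arbitrary local path.
training_args = TrainingArguments(
    output_dir="roberta-base-rotten-tomatoes",
    num_train_epochs=10,
    per_device_train_batch_size=64,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```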
The best score the model achieved on this task was 0.9033771106941839, as measured by the
eval set accuracy, found after 2 epochs.
For more information, check out TextAttack on GitHub (https://github.com/QData/TextAttack).
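To try the fine-tuned checkpoint, a transformers pipeline is the simplest route. The repository id below is a placeholder rather than one taken from this card; substitute the model's actual Hub id.

```python
from transformers import pipeline

# Replace the placeholder with this model's actual Hub repository id.
classifier = pipeline(
    "text-classification",
    model="your-username/roberta-base-rotten-tomatoes",
)

print(classifier("A gripping, beautifully shot film."))
# For rotten_tomatoes, LABEL_0 corresponds to negative and LABEL_1 to positive.
```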