CER: 15.4%
Framework: transformers 4.46.3
Train Args:
per_device_train_batch_size=32,
gradient_accumulation_steps=1,
learning_rate=1e-5,
gradient_checkpointing=True,
per_device_eval_batch_size=64,
generation_max_length=225,
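The hyperparameters above match the field names of transformers' `Seq2SeqTrainingArguments`. A minimal sketch of how they combine into the effective global batch size; the dict form is illustrative, and `NUM_GPUS` is taken from the 4x V100 setup listed under Hardware:

```python
NUM_GPUS = 4  # from the Hardware section: NVIDIA Tesla V100 16GB * 4

# Illustrative copy of the training arguments listed above.
train_args = {
    "per_device_train_batch_size": 32,
    "gradient_accumulation_steps": 1,
    "learning_rate": 1e-5,
    "gradient_checkpointing": True,
    "per_device_eval_batch_size": 64,
    "generation_max_length": 225,
}

# Effective global batch size per optimizer step:
# per-device batch * accumulation steps * number of GPUs.
global_batch = (
    train_args["per_device_train_batch_size"]
    * train_args["gradient_accumulation_steps"]
    * NUM_GPUS
)
print(global_batch)  # 32 * 1 * 4 = 128
```

With `gradient_accumulation_steps=1`, the global batch size comes entirely from data parallelism across the four GPUs.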
Hardware:
NVIDIA Tesla V100 16GB * 4
FAQ:
- If you hit a tokenizer issue during inference, update transformers to version 4.46.3 or later:
pip install --upgrade transformers==4.46.3
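To confirm the upgrade took effect, a small stdlib-only sketch that checks the installed transformers version against the minimum from the FAQ (the `meets_minimum` helper is hypothetical, not part of any library, and the simple dotted-number parsing does not handle pre-release suffixes like `.dev0`):

```python
from importlib.metadata import PackageNotFoundError, version

MIN_VERSION = "4.46.3"  # minimum transformers version from the FAQ above

def meets_minimum(installed: str, required: str = MIN_VERSION) -> bool:
    """Compare dotted version strings numerically, e.g. '4.46.3' >= '4.45.0'."""
    def to_tuple(v: str) -> tuple:
        return tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

try:
    current = version("transformers")
    if not meets_minimum(current):
        print(f"transformers {current} is too old; need >= {MIN_VERSION}")
except PackageNotFoundError:
    print("transformers is not installed in this environment")
```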