---
library_name: transformers
tags: []
---

# Model Card for Model ID
Continuous fine-tuning of deberta-tasksource, further trained on a newer version of tasksource and with a context length of 1024 tokens.
Upcoming: longer training and a 1280-token context length.
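
As a rough sketch of how this checkpoint could be loaded with transformers: the repository ID below is a placeholder, and the sequence-classification head is an assumption carried over from the original deberta-tasksource models.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder repository ID; replace with the actual model ID of this checkpoint.
model_id = "your-username/deberta-tasksource-1024"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# The card states a 1024-token context length, so truncate inputs accordingly.
inputs = tokenizer(
    "An example input text ...",
    truncation=True,
    max_length=1024,
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.logits)
```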