---
library_name: transformers
tags: []
---
# Model Card
Continued fine-tuning of deberta-tasksource, trained on a newer version of the tasksource collection with a context length of 1024 tokens.

Upcoming: longer training and a 1280-token context length.
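
A minimal usage sketch with the transformers library. The card does not state the repository ID, so the `model_id` below is a placeholder assumption; substitute this repository's actual ID. The premise/hypothesis input follows the typical tasksource NLI setup and is likewise an assumption.

```python
# Minimal sketch: load the model and score an NLI-style pair with transformers.
# model_id is a placeholder assumption; replace it with this repository's actual ID.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "sileod/deberta-v3-base-tasksource-nli"  # assumption, not stated in this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Truncate to the 1024-token context length described above.
inputs = tokenizer(
    "The cat sat on the mat.",   # premise (example input)
    "An animal is resting.",     # hypothesis (example input)
    truncation=True,
    max_length=1024,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))
```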