---
library_name: transformers
pipeline_tag: zero-shot-classification
---
# Model Card for Model ID
Continued fine-tuning of deberta-tasksource, trained on newer tasksource data and with a context length of 1024 tokens.
Upcoming: longer training and a 1280-token context length.
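
Below is a minimal usage sketch for the zero-shot-classification pipeline declared in the metadata above. The repository id in the snippet is a placeholder, since this card does not state the actual model id; replace it with this model's repo path.

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual model path for this card.
classifier = pipeline(
    "zero-shot-classification",
    model="your-username/deberta-tasksource-1024",
)

# Classify a sentence against arbitrary candidate labels (zero-shot).
result = classifier(
    "The new GPU driver update noticeably improved rendering performance.",
    candidate_labels=["technology", "sports", "politics"],
)

# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```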