Update README.md
README.md CHANGED
@@ -152,7 +152,7 @@ library_name: transformers
 
 # Model Card for DeBERTa-v3-base-tasksource-nli
 
-DeBERTa model jointly fine-tuned on 444 tasks of the tasksource collection https://github.com/sileod/tasksource/
+DeBERTa pretrained model jointly fine-tuned on 444 tasks of the tasksource collection https://github.com/sileod/tasksource/
 This is the model with the MNLI classifier on top. Its encoder was trained on many datasets including bigbench, Anthropic/hh-rlhf... alongside many NLI and classification tasks with SequenceClassification heads while using only one shared encoder.
 
 Each task had a specific CLS embedding, which is dropped 10% of the time to facilitate model use without it. All multiple-choice tasks used the same classification layers. For classification tasks, models shared weights if their labels matched.
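Since the card describes a model with an MNLI classifier on top, it can plausibly be driven through the transformers zero-shot-classification pipeline, which reuses an NLI head to score candidate labels. A minimal sketch follows; the hub repo id `sileod/deberta-v3-base-tasksource-nli` is assumed from the card title and may differ.

```python
# Hedged sketch: assumes the model is published under
# "sileod/deberta-v3-base-tasksource-nli" on the Hugging Face Hub.
from transformers import pipeline

# The zero-shot pipeline wraps the MNLI head: each candidate label is turned
# into a hypothesis and scored for entailment against the input text.
classifier = pipeline(
    "zero-shot-classification",
    model="sileod/deberta-v3-base-tasksource-nli",
)

result = classifier(
    "DeBERTa was fine-tuned on 444 tasks from the tasksource collection.",
    candidate_labels=["machine learning", "cooking", "sports"],
)

# result["labels"] is sorted by descending score; with the default
# single-label mode the scores are a softmax over the candidates.
print(result["labels"][0], result["scores"][0])
```

In the default single-label mode the scores sum to 1 across the candidate labels; pass `multi_label=True` to score each label independently instead.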