---
base_model: mtheo/camembert-base-xnli
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: legal-data-mDEBERTa-V3-base-mnli-xnli
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# legal-data-mDEBERTa-V3-base-mnli-xnli

This model is a fine-tuned version of [mtheo/camembert-base-xnli](https://huggingface.co/mtheo/camembert-base-xnli) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7972
- Accuracy: 0.7706
- Precision: 0.7891
- Recall: 0.7711
- F1: 0.7697
- Ratio: 0.3154
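
For reference, metrics like the accuracy and macro-averaged precision/recall/F1 above can be computed from label predictions as follows. This is a minimal pure-Python sketch of macro averaging, not the exact evaluation code used for this card:

```python
from collections import Counter

def macro_scores(y_true, y_pred):
    """Accuracy plus macro-averaged precision, recall, and F1."""
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1        # correct prediction for class t
        else:
            fp[p] += 1        # predicted p, but true class was t
            fn[t] += 1        # missed an instance of class t
    precisions, recalls, f1s = [], [], []
    for lab in labels:
        prec = tp[lab] / (tp[lab] + fp[lab]) if (tp[lab] + fp[lab]) else 0.0
        rec = tp[lab] / (tp[lab] + fn[lab]) if (tp[lab] + fn[lab]) else 0.0
        f1 = 2 * prec * rec / (prec + rec) if (prec + rec) else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    n = len(labels)
    return accuracy, sum(precisions) / n, sum(recalls) / n, sum(f1s) / n
```

Macro averaging weights every class equally, which is the usual choice for a three-way NLI label set (entailment / neutral / contradiction).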

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 20
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- lr_scheduler_warmup_steps: 4
- num_epochs: 15
- label_smoothing_factor: 0.1
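
The `linear` scheduler above ramps the learning rate from 0 to the peak over the warmup steps, then decays it linearly to 0 over the remaining steps. (The card lists both `lr_scheduler_warmup_ratio` and `lr_scheduler_warmup_steps`; in Transformers, an explicit `warmup_steps` value takes precedence over the ratio.) A sketch of the standard linear-with-warmup behaviour, using this run's peak learning rate and warmup steps:

```python
def linear_lr_with_warmup(step, total_steps, peak_lr=2e-05, warmup_steps=4):
    """Linear warmup to peak_lr, then linear decay to 0 (lr_scheduler_type: linear)."""
    if step < warmup_steps:
        # Warmup phase: learning rate grows linearly from 0 to peak_lr.
        return peak_lr * step / max(1, warmup_steps)
    # Decay phase: learning rate shrinks linearly to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / max(1, total_steps - warmup_steps)
```

With the ~530 optimizer steps seen in the results table, the rate peaks at step 4 and reaches zero at the final step.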

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     | Ratio  |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
| 0.9937        | 0.17  | 10   | 0.8509          | 0.6093   | 0.7022    | 0.6086 | 0.6168 | 0.1935 |
| 0.8433        | 0.34  | 20   | 0.7131          | 0.6667   | 0.6703    | 0.6687 | 0.6666 | 0.3262 |
| 0.8683        | 0.52  | 30   | 0.7101          | 0.7312   | 0.7350    | 0.7324 | 0.7324 | 0.3262 |
| 0.8536        | 0.69  | 40   | 0.7542          | 0.6989   | 0.7152    | 0.6999 | 0.7053 | 0.3011 |
| 0.7964        | 0.86  | 50   | 0.7249          | 0.7670   | 0.7773    | 0.7677 | 0.7630 | 0.3297 |
| 0.7651        | 1.03  | 60   | 0.8253          | 0.7455   | 0.7517    | 0.7465 | 0.7450 | 0.3262 |
| 0.7658        | 1.21  | 70   | 0.8282          | 0.6953   | 0.7366    | 0.6956 | 0.7030 | 0.2688 |
| 0.7297        | 1.38  | 80   | 0.8694          | 0.7634   | 0.7771    | 0.7641 | 0.7612 | 0.3226 |
| 0.7726        | 1.55  | 90   | 0.7898          | 0.7097   | 0.7112    | 0.7112 | 0.7111 | 0.3297 |
| 0.7107        | 1.72  | 100  | 0.8279          | 0.7599   | 0.7642    | 0.7608 | 0.7589 | 0.3297 |
| 0.7204        | 1.9   | 110  | 0.9353          | 0.7240   | 0.7728    | 0.7240 | 0.7279 | 0.2724 |
| 0.7241        | 2.07  | 120  | 0.7903          | 0.7527   | 0.7818    | 0.7530 | 0.7531 | 0.3011 |
| 0.6683        | 2.24  | 130  | 0.8139          | 0.7384   | 0.7931    | 0.7382 | 0.7397 | 0.2760 |
| 0.6832        | 2.41  | 140  | 0.8339          | 0.7993   | 0.8268    | 0.7996 | 0.7913 | 0.3297 |
| 0.7397        | 2.59  | 150  | 0.8309          | 0.7849   | 0.7997    | 0.7855 | 0.7801 | 0.3297 |
| 0.6945        | 2.76  | 160  | 0.7860          | 0.7240   | 0.7259    | 0.7253 | 0.7247 | 0.3297 |
| 0.7067        | 2.93  | 170  | 0.7046          | 0.7957   | 0.8152    | 0.7962 | 0.7898 | 0.3297 |
| 0.6759        | 3.1   | 180  | 0.7465          | 0.7634   | 0.7703    | 0.7643 | 0.7611 | 0.3297 |
| 0.6673        | 3.28  | 190  | 0.8461          | 0.8029   | 0.8234    | 0.8033 | 0.7972 | 0.3297 |
| 0.6748        | 3.45  | 200  | 0.8701          | 0.8065   | 0.8354    | 0.8068 | 0.7987 | 0.3297 |
| 0.7638        | 3.62  | 210  | 0.7501          | 0.8136   | 0.8521    | 0.8138 | 0.8043 | 0.3297 |
| 0.6426        | 3.79  | 220  | 0.7165          | 0.8100   | 0.8379    | 0.8103 | 0.8029 | 0.3297 |
| 0.6569        | 3.97  | 230  | 0.7244          | 0.8100   | 0.8415    | 0.8103 | 0.8020 | 0.3297 |
| 0.7068        | 4.14  | 240  | 0.7448          | 0.8100   | 0.8499    | 0.8102 | 0.8000 | 0.3297 |
| 0.6544        | 4.31  | 250  | 0.8241          | 0.8065   | 0.8476    | 0.8066 | 0.7957 | 0.3297 |
| 0.6261        | 4.48  | 260  | 0.8409          | 0.7634   | 0.7692    | 0.7643 | 0.7617 | 0.3297 |
| 0.6428        | 4.66  | 270  | 0.7887          | 0.7491   | 0.7528    | 0.7501 | 0.7484 | 0.3297 |
| 0.657         | 4.83  | 280  | 0.7534          | 0.7706   | 0.7791    | 0.7714 | 0.7677 | 0.3297 |
| 0.6895        | 5.0   | 290  | 0.7863          | 0.8100   | 0.8379    | 0.8103 | 0.8029 | 0.3297 |
| 0.6422        | 5.17  | 300  | 0.7908          | 0.8136   | 0.8439    | 0.8139 | 0.8062 | 0.3297 |
| 0.5933        | 5.34  | 310  | 0.8330          | 0.7885   | 0.8031    | 0.7891 | 0.7869 | 0.3226 |
| 0.5863        | 5.52  | 320  | 0.8494          | 0.7527   | 0.7598    | 0.7536 | 0.7535 | 0.3226 |
| 0.6787        | 5.69  | 330  | 0.7748          | 0.7742   | 0.7823    | 0.7750 | 0.7716 | 0.3297 |
| 0.6761        | 5.86  | 340  | 0.7256          | 0.7814   | 0.7929    | 0.7820 | 0.7775 | 0.3297 |
| 0.6974        | 6.03  | 350  | 0.7711          | 0.8029   | 0.8208    | 0.8033 | 0.7980 | 0.3297 |
| 0.6083        | 6.21  | 360  | 0.8435          | 0.7993   | 0.8191    | 0.7997 | 0.7951 | 0.3262 |
| 0.5897        | 6.38  | 370  | 0.8773          | 0.7849   | 0.8124    | 0.7852 | 0.7831 | 0.3118 |
| 0.6076        | 6.55  | 380  | 0.8255          | 0.7634   | 0.7683    | 0.7644 | 0.7623 | 0.3297 |
| 0.6709        | 6.72  | 390  | 0.7865          | 0.7527   | 0.7551    | 0.7538 | 0.7530 | 0.3297 |
| 0.7063        | 6.9   | 400  | 0.7898          | 0.8029   | 0.8234    | 0.8033 | 0.7972 | 0.3297 |
| 0.6804        | 7.07  | 410  | 0.7804          | 0.7921   | 0.8152    | 0.7925 | 0.7848 | 0.3297 |
| 0.6227        | 7.24  | 420  | 0.7515          | 0.7706   | 0.7778    | 0.7714 | 0.7683 | 0.3297 |
| 0.6482        | 7.41  | 430  | 0.7758          | 0.7670   | 0.7725    | 0.7679 | 0.7656 | 0.3297 |
| 0.6072        | 7.59  | 440  | 0.8077          | 0.7706   | 0.7792    | 0.7714 | 0.7693 | 0.3262 |
| 0.5695        | 7.76  | 450  | 0.8460          | 0.7921   | 0.8068    | 0.7926 | 0.7892 | 0.3262 |
| 0.6097        | 7.93  | 460  | 0.7856          | 0.7993   | 0.8191    | 0.7997 | 0.7951 | 0.3262 |
| 0.5591        | 8.1   | 470  | 0.7812          | 0.8136   | 0.8447    | 0.8138 | 0.8074 | 0.3262 |
| 0.5573        | 8.28  | 480  | 0.7249          | 0.7849   | 0.7930    | 0.7857 | 0.7828 | 0.3297 |
| 0.6128        | 8.45  | 490  | 0.7245          | 0.7921   | 0.8006    | 0.7928 | 0.7901 | 0.3297 |
| 0.6072        | 8.62  | 500  | 0.7732          | 0.7885   | 0.8038    | 0.7891 | 0.7852 | 0.3262 |
| 0.6276        | 8.79  | 510  | 0.8017          | 0.7885   | 0.8038    | 0.7891 | 0.7852 | 0.3262 |
| 0.647         | 8.97  | 520  | 0.7998          | 0.7885   | 0.8038    | 0.7891 | 0.7852 | 0.3262 |
| 0.5924        | 9.14  | 530  | 0.7972          | 0.7706   | 0.7891    | 0.7711 | 0.7697 | 0.3154 |


### Framework versions

- Transformers 4.39.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2