---
license: cc-by-sa-4.0
base_model: nlpaueb/legal-bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Flavio
  results: []
---

# Flavio

This model is a fine-tuned version of [nlpaueb/legal-bert-base-uncased](https://huggingface.co/nlpaueb/legal-bert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3914
- Accuracy: 0.9150
- F1 Macro: 0.8231
- F1 Class 0: 0.9472
- F1 Class 1: 0.6667
- F1 Class 2: 0.9259
- F1 Class 3: 0.8421
- F1 Class 4: 0.9000
- F1 Class 5: 0.9615
- F1 Class 6: 0.8000
- F1 Class 7: 0.9556
- F1 Class 8: 0.9655
- F1 Class 9: 0.8621
- F1 Class 10: 0.8924
- F1 Class 11: 0.7143
- F1 Class 12: 0.8101
- F1 Class 13: 0.7500
- F1 Class 14: 0.8889
- F1 Class 15: 0.7500
- F1 Class 16: 0.0000
- F1 Class 17: 0.9880
- F1 Class 18: 0.9180
- F1 Class 19: 0.9231

## Model description

More information needed

## Intended uses & limitations

More information needed. One caveat is visible in the results themselves: F1 for class 16 is 0.0 at every evaluation checkpoint, so predictions for that class should be treated as unreliable.
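
No usage example was provided with this card. As a minimal loading sketch (the hub repository id is unknown, so the model path below is a placeholder, and the 20 class labels are undocumented, so outputs use generic label ids):

```python
from transformers import pipeline

# Placeholder path: the card does not state where this checkpoint is published.
classifier = pipeline("text-classification", model="path/to/Flavio")

# The 20 labels are undocumented, so outputs are generic LABEL_0 .. LABEL_19.
print(classifier("The lessee shall maintain the premises in good repair."))
```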

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
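
A hedged reconstruction of this setup with the `Trainer` API might look as follows. This is a sketch, not the exact training script: dataset loading is omitted because the training data is undocumented, and `num_labels=20` is inferred from the per-class F1 scores reported above.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "nlpaueb/legal-bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=20)

# Hyperparameters taken from the list above. The Adam betas/epsilon listed
# there match the TrainingArguments defaults, so they need no explicit setting.
args = TrainingArguments(
    output_dir="flavio",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...)  # data undocumented
# trainer.train()
```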

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Class 0 | F1 Class 1 | F1 Class 2 | F1 Class 3 | F1 Class 4 | F1 Class 5 | F1 Class 6 | F1 Class 7 | F1 Class 8 | F1 Class 9 | F1 Class 10 | F1 Class 11 | F1 Class 12 | F1 Class 13 | F1 Class 14 | F1 Class 15 | F1 Class 16 | F1 Class 17 | F1 Class 18 | F1 Class 19 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|
| 1.2343        | 0.39  | 250  | 0.7445          | 0.8363   | 0.5500   | 0.875      | 0.0        | 0.8959     | 0.8421     | 0.0769     | 0.6818     | 0.6667     | 0.9556     | 0.9492     | 0.6190     | 0.8339      | 0.0         | 0.7442      | 0.2000      | 0.8267      | 0.0         | 0.0         | 0.9760      | 0.8571      | 0.0         |
| 0.5654        | 0.79  | 500  | 0.5466          | 0.8690   | 0.6846   | 0.9124     | 0.0        | 0.9189     | 0.8421     | 0.7660     | 0.8302     | 0.6531     | 0.9663     | 0.9310     | 0.7353     | 0.8580      | 0.0         | 0.7564      | 0.8889      | 0.8272      | 0.1         | 0.0         | 0.9759      | 0.8070      | 0.9231      |
| 0.439         | 1.18  | 750  | 0.4626          | 0.8832   | 0.7211   | 0.9209     | 0.0        | 0.9217     | 0.8421     | 0.8        | 0.9057     | 0.6667     | 0.9556     | 0.9455     | 0.8000     | 0.8554      | 0.2857      | 0.7799      | 0.8889      | 0.8462      | 0.1905      | 0.0         | 0.9759      | 0.9180      | 0.9231      |
| 0.3397        | 1.57  | 1000 | 0.4744          | 0.8885   | 0.7457   | 0.9207     | 0.0        | 0.9327     | 0.8421     | 0.7826     | 0.8364     | 0.7547     | 0.9663     | 0.9655     | 0.7273     | 0.8735      | 0.6667      | 0.8077      | 0.8889      | 0.8553      | 0.32        | 0.0         | 0.9730      | 0.8772      | 0.9231      |
| 0.3351        | 1.97  | 1250 | 0.4128          | 0.8938   | 0.7784   | 0.9350     | 0.4        | 0.9217     | 0.8000     | 0.8108     | 0.8519     | 0.6939     | 0.9663     | 0.9474     | 0.7719     | 0.8563      | 0.7692      | 0.8199      | 0.8889      | 0.8903      | 0.4800      | 0.0         | 0.9790      | 0.8621      | 0.9231      |
| 0.2384        | 2.36  | 1500 | 0.3982          | 0.9071   | 0.8016   | 0.9431     | 0.4        | 0.9259     | 0.8421     | 0.9048     | 0.8772     | 0.8333     | 0.9556     | 0.9655     | 0.8302     | 0.8810      | 0.6667      | 0.7922      | 0.8889      | 0.8961      | 0.5882      | 0.0         | 0.9850      | 0.9333      | 0.9231      |
| 0.2309        | 2.75  | 1750 | 0.3741          | 0.9133   | 0.8191   | 0.9494     | 0.6667     | 0.9266     | 0.8421     | 0.8780     | 0.9091     | 0.8197     | 0.9556     | 0.9655     | 0.84       | 0.8831      | 0.625       | 0.8026      | 0.8235      | 0.9032      | 0.7647      | 0.0         | 0.9880      | 0.9153      | 0.9231      |
| 0.2243        | 3.14  | 2000 | 0.3962          | 0.9080   | 0.8146   | 0.9435     | 0.5714     | 0.9302     | 0.8421     | 0.9        | 0.9804     | 0.7059     | 0.9556     | 0.9492     | 0.8727     | 0.8765      | 0.7692      | 0.8050      | 0.8235      | 0.8889      | 0.6452      | 0.0         | 0.9760      | 0.9333      | 0.9231      |
| 0.1781        | 3.54  | 2250 | 0.3775          | 0.9133   | 0.8137   | 0.9418     | 0.4        | 0.9395     | 0.8421     | 0.9        | 0.9091     | 0.8814     | 0.9556     | 0.9655     | 0.8421     | 0.8952      | 0.7143      | 0.8077      | 0.8235      | 0.8679      | 0.7500      | 0.0         | 0.9816      | 0.9333      | 0.9231      |
| 0.169         | 3.93  | 2500 | 0.4092          | 0.9080   | 0.8157   | 0.9395     | 0.6667     | 0.9224     | 0.8421     | 0.9        | 0.9091     | 0.8136     | 0.9556     | 0.9655     | 0.8621     | 0.8825      | 0.6667      | 0.8077      | 0.75        | 0.8701      | 0.7500      | 0.0         | 0.9879      | 0.9         | 0.9231      |
| 0.1406        | 4.32  | 2750 | 0.3886          | 0.9097   | 0.8244   | 0.9424     | 0.5714     | 0.9266     | 0.8421     | 0.9048     | 0.9615     | 0.7931     | 0.9556     | 0.9492     | 0.8667     | 0.8790      | 0.7692      | 0.7949      | 0.8889      | 0.8718      | 0.7273      | 0.0         | 0.9849      | 0.9355      | 0.9231      |
| 0.1245        | 4.72  | 3000 | 0.3914          | 0.9150   | 0.8231   | 0.9472     | 0.6667     | 0.9259     | 0.8421     | 0.9        | 0.9615     | 0.8        | 0.9556     | 0.9655     | 0.8621     | 0.8924      | 0.7143      | 0.8101      | 0.75        | 0.8889      | 0.7500      | 0.0         | 0.9880      | 0.9180      | 0.9231      |

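The accuracy, macro F1, and per-class F1 columns above can be produced with a standard `compute_metrics` callback. The following scikit-learn sketch is an assumption, since the card does not say how the metrics were computed; it would be passed as `Trainer(..., compute_metrics=compute_metrics)`.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # average=None returns one F1 score per class, in label order.
    per_class = f1_score(labels, preds, average=None, labels=list(range(20)))
    metrics = {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_score(labels, preds, average="macro"),
    }
    metrics.update({f"f1_class_{i}": score for i, score in enumerate(per_class)})
    return metrics
```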

### Framework versions

- Transformers 4.32.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.4
- Tokenizers 0.13.3