---
license: apache-2.0
base_model: google-bert/bert-base-cased
tags:
- generated_from_trainer
model-index:
- name: bert-base-cased_legal_ner_finetuned
  results: []
---


# bert-base-cased_legal_ner_finetuned

This model is a fine-tuned version of [google-bert/bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on an unspecified legal NER dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3018
- Law Precision: 0.7364
- Law Recall: 0.8261
- Law F1: 0.7787
- Law Number: 115
- Violated by Precision: 0.8525
- Violated by Recall: 0.6933
- Violated by F1: 0.7647
- Violated by Number: 75
- Violated on Precision: 0.4688
- Violated on Recall: 0.4286
- Violated on F1: 0.4478
- Violated on Number: 70
- Violation Precision: 0.6323
- Violation Recall: 0.7251
- Violation F1: 0.6755
- Violation Number: 491
- Overall Precision: 0.6524
- Overall Recall: 0.7097
- Overall F1: 0.6798
- Overall Accuracy: 0.9439
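
The per-entity precision/recall/F1/support breakdown above matches the output format of the `seqeval` metric. As a hedged sketch (the exact label list is an assumption inferred from the metric names), the numbers are typically produced by a `compute_metrics` function like this:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# Label list assumed from the metric names above (IOB2 scheme).
label_list = [
    "O",
    "B-LAW", "I-LAW",
    "B-VIOLATED BY", "I-VIOLATED BY",
    "B-VIOLATED ON", "I-VIOLATED ON",
    "B-VIOLATION", "I-VIOLATION",
]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop special tokens (label -100) before scoring.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    # Returns overall and per-entity precision/recall/F1/number.
    return seqeval.compute(predictions=true_predictions, references=true_labels)
```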

## Model description

This model performs token classification (named entity recognition) on legal text. As the evaluation metrics above indicate, it tags four entity types: Law, Violated by, Violated on, and Violation.
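
A minimal inference sketch using the `transformers` token-classification pipeline. The hub namespace below is a placeholder (the card does not state it), and `aggregation_strategy="simple"` merges sub-word pieces into whole entity spans:

```python
from transformers import pipeline

# The namespace is a placeholder; substitute the actual hub path.
ner = pipeline(
    "token-classification",
    model="<namespace>/bert-base-cased_legal_ner_finetuned",
    aggregation_strategy="simple",  # merge WordPiece tokens into entity spans
)

text = "The company was found to have violated Section 5 of the FTC Act."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```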

## Intended uses & limitations

More information needed. One limitation is already visible in the evaluation metrics above: performance varies sharply by entity type. Law entities reach an F1 of 0.78, while Violated on reaches only 0.45 (over just 70 evaluation entities), so predictions for that class warrant extra scrutiny.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
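
A sketch of how the settings above map onto `transformers.TrainingArguments`; the output directory and the per-epoch evaluation strategy are assumptions for illustration, while the remaining values mirror the list:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased_legal_ner_finetuned",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    eval_strategy="epoch",  # assumption: the results table logs one eval per epoch
)
```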

### Training results

| Training Loss | Epoch | Step | Validation Loss | Law Precision | Law Recall | Law F1 | Law Number | Violated by Precision | Violated by Recall | Violated by F1 | Violated by Number | Violated on Precision | Violated on Recall | Violated on F1 | Violated on Number | Violation Precision | Violation Recall | Violation F1 | Violation Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:---------------------:|:------------------:|:--------------:|:------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| No log        | 1.0   | 85   | 0.8046          | 0.0           | 0.0        | 0.0    | 115        | 0.0                   | 0.0                | 0.0            | 75                 | 0.0                   | 0.0                | 0.0            | 70                 | 0.0                 | 0.0              | 0.0          | 491              | 0.0               | 0.0            | 0.0        | 0.7619           |
| No log        | 2.0   | 170  | 0.4050          | 0.0           | 0.0        | 0.0    | 115        | 0.0                   | 0.0                | 0.0            | 75                 | 0.0                   | 0.0                | 0.0            | 70                 | 0.1835              | 0.2037           | 0.1931       | 491              | 0.1835            | 0.1332         | 0.1543     | 0.8819           |
| No log        | 3.0   | 255  | 0.2861          | 0.6111        | 0.4783     | 0.5366 | 115        | 0.1818                | 0.0533             | 0.0825         | 75                 | 0.4                   | 0.0571             | 0.1000         | 70                 | 0.4345              | 0.5540           | 0.4870       | 491              | 0.4479            | 0.4461         | 0.4470     | 0.9130           |
| No log        | 4.0   | 340  | 0.2552          | 0.75          | 0.7043     | 0.7265 | 115        | 0.5625                | 0.36               | 0.4390         | 75                 | 0.3429                | 0.1714             | 0.2286         | 70                 | 0.4924              | 0.5927           | 0.5379       | 491              | 0.5256            | 0.5473         | 0.5362     | 0.9257           |
| No log        | 5.0   | 425  | 0.2676          | 0.7154        | 0.7652     | 0.7395 | 115        | 0.7308                | 0.5067             | 0.5984         | 75                 | 0.2778                | 0.1429             | 0.1887         | 70                 | 0.5368              | 0.6090           | 0.5706       | 491              | 0.5664            | 0.5792         | 0.5727     | 0.9300           |
| 0.4786        | 6.0   | 510  | 0.2663          | 0.6767        | 0.7826     | 0.7258 | 115        | 0.7903                | 0.6533             | 0.7153         | 75                 | 0.3684                | 0.4                | 0.3836         | 70                 | 0.6155              | 0.7271           | 0.6667       | 491              | 0.6157            | 0.6977         | 0.6542     | 0.9366           |
| 0.4786        | 7.0   | 595  | 0.2352          | 0.6957        | 0.8348     | 0.7589 | 115        | 0.7941                | 0.72               | 0.7552         | 75                 | 0.4242                | 0.4                | 0.4118         | 70                 | 0.5799              | 0.7169           | 0.6412       | 491              | 0.6030            | 0.7057         | 0.6503     | 0.9412           |
| 0.4786        | 8.0   | 680  | 0.2728          | 0.6835        | 0.8261     | 0.7480 | 115        | 0.7857                | 0.7333             | 0.7586         | 75                 | 0.3596                | 0.4571             | 0.4025         | 70                 | 0.5916              | 0.7434           | 0.6588       | 491              | 0.5978            | 0.7284         | 0.6567     | 0.9415           |
| 0.4786        | 9.0   | 765  | 0.2952          | 0.7385        | 0.8348     | 0.7837 | 115        | 0.8088                | 0.7333             | 0.7692         | 75                 | 0.5                   | 0.5                | 0.5            | 70                 | 0.6246              | 0.7352           | 0.6754       | 491              | 0.6466            | 0.7284         | 0.6850     | 0.9433           |
| 0.4786        | 10.0  | 850  | 0.3018          | 0.7364        | 0.8261     | 0.7787 | 115        | 0.8525                | 0.6933             | 0.7647         | 75                 | 0.4688                | 0.4286             | 0.4478         | 70                 | 0.6323              | 0.7251           | 0.6755       | 491              | 0.6524            | 0.7097         | 0.6798     | 0.9439           |


### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1