---
license: mit
tags:
- generated_from_trainer
model-index:
- name: deberta-v3-base_ai4privacy_en
  results: []
---

# deberta-v3-base_ai4privacy_en

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an English PII dataset from the ai4privacy project (the exact dataset was not recorded by the Trainer).
It achieves the following results on the evaluation set (a sketch of how such entity-level metrics are typically computed follows the list):
- Loss: 0.1055
- Overall Precision: 0.8683
- Overall Recall: 0.8949
- Overall F1: 0.8814
- Overall Accuracy: 0.9609
- Accountname F1: 0.9898
- Accountnumber F1: 0.9939
- Age F1: 0.8397
- Amount F1: 0.9169
- Bic F1: 0.9012
- Bitcoinaddress F1: 0.9583
- Buildingnumber F1: 0.8109
- City F1: 0.8011
- Companyname F1: 0.9437
- County F1: 0.8752
- Creditcardcvv F1: 0.8635
- Creditcardissuer F1: 0.9738
- Creditcardnumber F1: 0.8771
- Currency F1: 0.6542
- Currencycode F1: 0.5566
- Currencyname F1: 0.2214
- Currencysymbol F1: 0.8640
- Date F1: 0.8365
- Dob F1: 0.5696
- Email F1: 0.9914
- Ethereumaddress F1: 0.9903
- Eyecolor F1: 0.9076
- Firstname F1: 0.8759
- Gender F1: 0.9324
- Height F1: 0.9046
- Iban F1: 0.9899
- Ip F1: 0.1137
- Ipv4 F1: 0.8118
- Ipv6 F1: 0.8091
- Jobarea F1: 0.7895
- Jobtitle F1: 0.9806
- Jobtype F1: 0.9056
- Lastname F1: 0.8179
- Litecoinaddress F1: 0.8739
- Mac F1: 1.0
- Maskednumber F1: 0.8319
- Middlename F1: 0.8419
- Nearbygpscoordinate F1: 1.0
- Ordinaldirection F1: 0.9682
- Password F1: 0.9595
- Phoneimei F1: 0.9930
- Phonenumber F1: 0.9807
- Pin F1: 0.7868
- Prefix F1: 0.9355
- Secondaryaddress F1: 0.9967
- Sex F1: 0.9692
- Ssn F1: 0.9898
- State F1: 0.7407
- Street F1: 0.7823
- Time F1: 0.9500
- Url F1: 0.9936
- Useragent F1: 0.9976
- Username F1: 0.9331
- Vehiclevin F1: 0.9713
- Vehiclevrm F1: 0.9493
- Zipcode F1: 0.8634

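These per-entity scores follow the output format of the Hugging Face `seqeval` metric (overall precision/recall/F1/accuracy plus one F1 per entity type). The card does not state which evaluation code was used, so the snippet below is only a minimal sketch, with hypothetical IOB2 label sequences, of how such numbers are typically computed:

```python
import evaluate

# The seqeval metric produces the same overall_* and per-entity fields
# reported above; the label sequences here are purely illustrative.
seqeval = evaluate.load("seqeval")

y_true = [["O", "B-FIRSTNAME", "I-FIRSTNAME", "O", "B-CITY"]]
y_pred = [["O", "B-FIRSTNAME", "I-FIRSTNAME", "O", "O"]]

results = seqeval.compute(predictions=y_pred, references=y_true)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
print(results["FIRSTNAME"])  # per-entity precision/recall/f1/number
```
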
## Model description

This is a token-classification (named-entity recognition) model for detecting personally identifiable information (PII) in English text. It was fine-tuned from [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) and predicts the PII entity types listed in the evaluation results above, ranging from person names, addresses, and dates of birth to account numbers, credit-card details, IP and MAC addresses, and cryptocurrency addresses.
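A minimal inference sketch is shown below. The repository id is a placeholder (the full Hub path of this model is not given in the card); `aggregation_strategy="simple"` merges sub-word pieces into whole entity spans.

```python
from transformers import pipeline

# Placeholder repository id; replace with the actual Hub path of this model.
pii_detector = pipeline(
    "token-classification",
    model="<namespace>/deberta-v3-base_ai4privacy_en",
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

text = "Hi, I'm John Smith and my email is john.smith@example.com."
for entity in pii_detector(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```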

## Intended uses & limitations

The model is intended for detecting PII spans in English text, for example as a pre-processing step for redaction or anonymization. Per-entity performance varies considerably: highly structured identifiers (MAC addresses, GPS coordinates, IBANs, phone IMEIs, e-mail addresses) score at or near 1.0 F1, while free-form or ambiguous classes are much weaker, notably the generic IP label (0.11 F1, although IPV4 and IPV6 both exceed 0.80), CURRENCYNAME (0.22 F1), CURRENCYCODE (0.56 F1), and DOB (0.57 F1). Downstream applications should account for these weaker classes rather than assume uniform recall across entity types.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch in code follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 5

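The snippet below is a hedged sketch of how these settings map onto `transformers.TrainingArguments`; the actual training script, dataset preparation, and `Trainer` setup are not recorded in this card, and `output_dir` is an assumption.

```python
from transformers import TrainingArguments

# Only the hyperparameters listed above are reproduced here; everything else
# (logging, checkpointing, mixed precision, ...) is left at library defaults.
training_args = TrainingArguments(
    output_dir="deberta-v3-base_ai4privacy_en",  # assumed, not recorded
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.2,
    num_train_epochs=5,
    # Adam(W) with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)
```
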
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Accountname F1 | Accountnumber F1 | Age F1 | Amount F1 | Bic F1 | Bitcoinaddress F1 | Buildingnumber F1 | City F1 | Companyname F1 | County F1 | Creditcardcvv F1 | Creditcardissuer F1 | Creditcardnumber F1 | Currency F1 | Currencycode F1 | Currencyname F1 | Currencysymbol F1 | Date F1 | Dob F1 | Email F1 | Ethereumaddress F1 | Eyecolor F1 | Firstname F1 | Gender F1 | Height F1 | Iban F1 | Ip F1  | Ipv4 F1 | Ipv6 F1 | Jobarea F1 | Jobtitle F1 | Jobtype F1 | Lastname F1 | Litecoinaddress F1 | Mac F1 | Maskednumber F1 | Middlename F1 | Nearbygpscoordinate F1 | Ordinaldirection F1 | Password F1 | Phoneimei F1 | Phonenumber F1 | Pin F1 | Prefix F1 | Secondaryaddress F1 | Sex F1 | Ssn F1 | State F1 | Street F1 | Time F1 | Url F1 | Useragent F1 | Username F1 | Vehiclevin F1 | Vehiclevrm F1 | Zipcode F1 |
|:-------------:|:-----:|:-----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|:--------------:|:----------------:|:------:|:---------:|:------:|:-----------------:|:-----------------:|:-------:|:--------------:|:---------:|:----------------:|:-------------------:|:-------------------:|:-----------:|:---------------:|:---------------:|:-----------------:|:-------:|:------:|:--------:|:------------------:|:-----------:|:------------:|:---------:|:---------:|:-------:|:------:|:-------:|:-------:|:----------:|:-----------:|:----------:|:-----------:|:------------------:|:------:|:---------------:|:-------------:|:----------------------:|:-------------------:|:-----------:|:------------:|:--------------:|:------:|:---------:|:-------------------:|:------:|:------:|:--------:|:---------:|:-------:|:------:|:------------:|:-----------:|:-------------:|:-------------:|:----------:|
| 0.463         | 1.0   | 4350  | 0.3229          | 0.5378            | 0.5277         | 0.5327     | 0.8941           | 0.8722         | 0.7667           | 0.5849 | 0.2284    | 0.5391 | 0.7502            | 0.3143            | 0.1514  | 0.2844         | 0.2640    | 0.0086           | 0.5288              | 0.0                 | 0.0956      | 0.0             | 0.0             | 0.3410            | 0.7146  | 0.0169 | 0.8043   | 0.9458             | 0.0090      | 0.4894       | 0.1550    | 0.0       | 0.8653  | 0.0    | 0.8168  | 0.7474  | 0.1611     | 0.4548      | 0.0035     | 0.3781      | 0.1472             | 0.8989 | 0.4641          | 0.0035        | 0.9955                 | 0.0                 | 0.7959      | 0.9464       | 0.7831         | 0.2258 | 0.7847    | 0.8639              | 0.5481 | 0.7480 | 0.0643   | 0.1795    | 0.7463  | 0.9683 | 0.9080       | 0.4569      | 0.8724        | 0.5152        | 0.5458     |
| 0.1944        | 2.0   | 8700  | 0.1709          | 0.7179            | 0.7495         | 0.7334     | 0.9387           | 0.9789         | 0.9718           | 0.6535 | 0.4640    | 0.6039 | 0.9240            | 0.6723            | 0.4777  | 0.8654         | 0.6234    | 0.7241           | 0.8713              | 0.6077              | 0.4598      | 0.0698          | 0.0104          | 0.6163            | 0.7518  | 0.4439 | 0.9803   | 0.9848             | 0.6276      | 0.6714       | 0.7937    | 0.6295    | 0.9538  | 0.0    | 0.8285  | 0.7976  | 0.5304     | 0.9253      | 0.6957     | 0.4694      | 0.7181             | 0.9892 | 0.6301          | 0.2027        | 0.9865                 | 0.8016              | 0.7931      | 0.9888       | 0.9658         | 0.3231 | 0.8959    | 0.9721              | 0.8506 | 0.9692 | 0.3841   | 0.4389    | 0.9064  | 0.9905 | 0.9670       | 0.8341      | 0.9563        | 0.8449        | 0.7487     |
| 0.1275        | 3.0   | 13050 | 0.1174          | 0.8276            | 0.8506         | 0.8390     | 0.9559           | 0.9881         | 0.9896           | 0.7347 | 0.8484    | 0.8214 | 0.9571            | 0.7815            | 0.7437  | 0.9289         | 0.7794    | 0.8323           | 0.9754              | 0.8624              | 0.4890      | 0.4318          | 0.2006          | 0.8043            | 0.8066  | 0.5459 | 0.9858   | 0.9903             | 0.8511      | 0.8071       | 0.8187    | 0.8657    | 0.9486  | 0.0    | 0.8396  | 0.8049  | 0.7326     | 0.9720      | 0.8699     | 0.6714      | 0.8655             | 0.9957 | 0.8194          | 0.6478        | 1.0                    | 0.9660              | 0.9331      | 0.9916       | 0.9711         | 0.6899 | 0.9302    | 0.9902              | 0.9413 | 0.9847 | 0.5684   | 0.7259    | 0.9381  | 0.9929 | 0.9953       | 0.9094      | 0.9598        | 0.9115        | 0.8324     |
| 0.0976        | 4.0   | 17400 | 0.1065          | 0.8624            | 0.8877         | 0.8749     | 0.9598           | 0.9907         | 0.9939           | 0.8312 | 0.9141    | 0.8689 | 0.9511            | 0.8027            | 0.8014  | 0.9538         | 0.8827    | 0.8599           | 0.9701              | 0.8634              | 0.6637      | 0.5488          | 0.1181          | 0.8541            | 0.8224  | 0.5333 | 0.9926   | 0.9876             | 0.9041      | 0.8664       | 0.9303    | 0.9207    | 0.9861  | 0.0591 | 0.8174  | 0.8098  | 0.7798     | 0.9686      | 0.9013     | 0.7845      | 0.8661             | 1.0    | 0.8091          | 0.8103        | 1.0                    | 0.9785              | 0.9430      | 0.9916       | 0.9806         | 0.7778 | 0.9354    | 0.9913              | 0.9692 | 0.9885 | 0.7476   | 0.7658    | 0.9427  | 0.9889 | 0.9976       | 0.9346      | 0.9797        | 0.9570        | 0.8362     |
| 0.0886        | 5.0   | 21750 | 0.1055          | 0.8683            | 0.8949         | 0.8814     | 0.9609           | 0.9898         | 0.9939           | 0.8397 | 0.9169    | 0.9012 | 0.9583            | 0.8109            | 0.8011  | 0.9437         | 0.8752    | 0.8635           | 0.9738              | 0.8771              | 0.6542      | 0.5566          | 0.2214          | 0.8640            | 0.8365  | 0.5696 | 0.9914   | 0.9903             | 0.9076      | 0.8759       | 0.9324    | 0.9046    | 0.9899  | 0.1137 | 0.8118  | 0.8091  | 0.7895     | 0.9806      | 0.9056     | 0.8179      | 0.8739             | 1.0    | 0.8319          | 0.8419        | 1.0                    | 0.9682              | 0.9595      | 0.9930       | 0.9807         | 0.7868 | 0.9355    | 0.9967              | 0.9692 | 0.9898 | 0.7407   | 0.7823    | 0.9500  | 0.9936 | 0.9976       | 0.9331      | 0.9713        | 0.9493        | 0.8634     |


### Framework versions

- Transformers 4.26.1
- Pytorch 2.0.0.post101
- Datasets 2.10.1
- Tokenizers 0.13.3