---
license: mit
tags:
- generated_from_trainer
model-index:
- name: roberta-base_ai4privacy_en
  results: []
---

# roberta-base_ai4privacy_en

This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base). The fine-tuning dataset is not recorded in the Trainer metadata; the model name suggests the English split of an ai4privacy PII-masking dataset.
It achieves the following results on the evaluation set (a sketch of the entity-level scoring follows the list):
- Loss: 0.0962
- Overall Precision: 0.8739
- Overall Recall: 0.9046
- Overall F1: 0.8890
- Overall Accuracy: 0.9623
- Accountname F1: 0.9898
- Accountnumber F1: 0.9896
- Age F1: 0.8745
- Amount F1: 0.8663
- Bic F1: 0.8782
- Bitcoinaddress F1: 0.9414
- Buildingnumber F1: 0.8279
- City F1: 0.8312
- Companyname F1: 0.9434
- County F1: 0.9279
- Creditcardcvv F1: 0.8947
- Creditcardissuer F1: 0.9755
- Creditcardnumber F1: 0.8770
- Currency F1: 0.6753
- Currencycode F1: 0.6398
- Currencyname F1: 0.2105
- Currencysymbol F1: 0.9223
- Date F1: 0.8276
- Dob F1: 0.5470
- Email F1: 0.9840
- Ethereumaddress F1: 0.9972
- Eyecolor F1: 0.9027
- Firstname F1: 0.8696
- Gender F1: 0.9627
- Height F1: 0.9811
- Iban F1: 0.9912
- Ip F1: 0.0124
- Ipv4 F1: 0.8377
- Ipv6 F1: 0.7585
- Jobarea F1: 0.8212
- Jobtitle F1: 0.9833
- Jobtype F1: 0.9110
- Lastname F1: 0.8305
- Litecoinaddress F1: 0.8793
- Mac F1: 0.9957
- Maskednumber F1: 0.8315
- Middlename F1: 0.9441
- Nearbygpscoordinate F1: 0.9970
- Ordinaldirection F1: 0.9682
- Password F1: 0.9654
- Phoneimei F1: 0.9944
- Phonenumber F1: 0.9860
- Pin F1: 0.8150
- Prefix F1: 0.9306
- Secondaryaddress F1: 0.9935
- Sex F1: 0.9721
- Ssn F1: 0.9759
- State F1: 0.8817
- Street F1: 0.8264
- Time F1: 0.9485
- Url F1: 0.9936
- Useragent F1: 0.9976
- Username F1: 0.9108
- Vehiclevin F1: 0.9568
- Vehiclevrm F1: 0.9239
- Zipcode F1: 0.8543
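
The evaluation script is not part of this card. Entity-level metrics of this kind are typically computed with `seqeval` (here via the `evaluate` library); the sketch below uses made-up BIO tag sequences purely for illustration and assumes `evaluate` and `seqeval` are installed.

```python
# Hedged sketch of seqeval-style entity-level scoring; the labels below are
# illustrative only and are not taken from the actual evaluation data.
import evaluate

seqeval = evaluate.load("seqeval")  # requires: pip install evaluate seqeval

# One sentence: gold vs. predicted BIO tags for a FIRSTNAME and an EMAIL span.
references = [["O", "B-FIRSTNAME", "O", "B-EMAIL", "I-EMAIL"]]
predictions = [["O", "B-FIRSTNAME", "O", "B-EMAIL", "O"]]  # EMAIL boundary missed

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"], results["overall_f1"])
print(results["FIRSTNAME"])  # per-entity precision / recall / f1 / support
```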

## Model description

A token-classification (NER) model that tags spans of personally identifiable information (PII) in English text. The label set covers the entity types reported above, ranging from names, emails, and phone numbers to account, card, and crypto-wallet identifiers. A minimal usage sketch follows.
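
The card ships without a usage snippet. Assuming this is a standard `transformers` token-classification checkpoint (which the per-entity metrics above indicate), a minimal inference sketch might look as follows; the model id and example text are placeholders, not part of the original card.

```python
# Minimal PII-tagging sketch; "roberta-base_ai4privacy_en" is assumed to be a
# local path or Hub id for this checkpoint (prepend the owner namespace if needed).
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "roberta-base_ai4privacy_en"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word pieces into whole entity spans.
pii_tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(pii_tagger("Contact Jane Doe at jane.doe@example.com or +1 555 0100."))
```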

## Intended uses & limitations

Intended for detecting PII spans in English text, for example as a pre-processing step for redaction or masking. Per-entity quality varies widely: classes such as Email, Iban, Useragent, and Mac score above 0.98 F1, while Currencyname (0.21) and the generic Ip class (0.01) are effectively unreliable, so downstream use should not rely on those labels. Behaviour on non-English or out-of-domain text has not been evaluated here.

## Training and evaluation data

Not documented by the Trainer. The model name points to an English ai4privacy PII dataset, but the exact dataset version and the evaluation split used for the results above are unspecified.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 5
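
These values map onto `transformers.TrainingArguments` roughly as sketched below; only the hyperparameters listed above are set, and everything else (the output directory in particular) is an assumption or a library default.

```python
# Approximate reconstruction of the reported hyperparameters; not the original
# training script. output_dir is assumed, all unlisted options stay at defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base_ai4privacy_en",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.2,
    num_train_epochs=5,
)
```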

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Accountname F1 | Accountnumber F1 | Age F1 | Amount F1 | Bic F1 | Bitcoinaddress F1 | Buildingnumber F1 | City F1 | Companyname F1 | County F1 | Creditcardcvv F1 | Creditcardissuer F1 | Creditcardnumber F1 | Currency F1 | Currencycode F1 | Currencyname F1 | Currencysymbol F1 | Date F1 | Dob F1 | Email F1 | Ethereumaddress F1 | Eyecolor F1 | Firstname F1 | Gender F1 | Height F1 | Iban F1 | Ip F1  | Ipv4 F1 | Ipv6 F1 | Jobarea F1 | Jobtitle F1 | Jobtype F1 | Lastname F1 | Litecoinaddress F1 | Mac F1 | Maskednumber F1 | Middlename F1 | Nearbygpscoordinate F1 | Ordinaldirection F1 | Password F1 | Phoneimei F1 | Phonenumber F1 | Pin F1 | Prefix F1 | Secondaryaddress F1 | Sex F1 | Ssn F1 | State F1 | Street F1 | Time F1 | Url F1 | Useragent F1 | Username F1 | Vehiclevin F1 | Vehiclevrm F1 | Zipcode F1 |
|:-------------:|:-----:|:-----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|:--------------:|:----------------:|:------:|:---------:|:------:|:-----------------:|:-----------------:|:-------:|:--------------:|:---------:|:----------------:|:-------------------:|:-------------------:|:-----------:|:---------------:|:---------------:|:-----------------:|:-------:|:------:|:--------:|:------------------:|:-----------:|:------------:|:---------:|:---------:|:-------:|:------:|:-------:|:-------:|:----------:|:-----------:|:----------:|:-----------:|:------------------:|:------:|:---------------:|:-------------:|:----------------------:|:-------------------:|:-----------:|:------------:|:--------------:|:------:|:---------:|:-------------------:|:------:|:------:|:--------:|:---------:|:-------:|:------:|:------------:|:-----------:|:-------------:|:-------------:|:----------:|
| 0.3911        | 1.0   | 2175  | 0.2642          | 0.6000            | 0.6420         | 0.6203     | 0.9119           | 0.9177         | 0.8043           | 0.6417 | 0.2664    | 0.4444 | 0.7762            | 0.2639            | 0.3614  | 0.6320         | 0.5282    | 0.5097           | 0.8493              | 0.4381              | 0.2180      | 0.0             | 0.0             | 0.4754            | 0.6817  | 0.0    | 0.9518   | 0.9710             | 0.5435      | 0.5869       | 0.6865    | 0.5455    | 0.7382  | 0.0    | 0.7780  | 0.7260  | 0.3064     | 0.6099      | 0.5096     | 0.4491      | 0.5872             | 0.8521 | 0.4547          | 0.0365        | 0.9822                 | 0.7915              | 0.7728      | 0.9553       | 0.8337         | 0.1159 | 0.8853    | 0.8930              | 0.9520 | 0.7916 | 0.1731   | 0.2881    | 0.7877  | 0.9793 | 0.9202       | 0.6108      | 0.7374        | 0.4551        | 0.5222     |
| 0.1748        | 2.0   | 4350  | 0.1410          | 0.7732            | 0.8035         | 0.7880     | 0.9479           | 0.9831         | 0.9575           | 0.8131 | 0.6771    | 0.8161 | 0.8795            | 0.6822            | 0.6022  | 0.8531         | 0.6954    | 0.8056           | 0.9663              | 0.8012              | 0.5330      | 0.3009          | 0.0571          | 0.8293            | 0.7760  | 0.3798 | 0.9646   | 0.9675             | 0.8677      | 0.6901       | 0.9239    | 0.9655    | 0.9073  | 0.0    | 0.8345  | 0.7913  | 0.7190     | 0.9331      | 0.8958     | 0.5220      | 0.7748             | 0.9913 | 0.7372          | 0.4010        | 0.9925                 | 0.9558              | 0.8982      | 0.9359       | 0.9586         | 0.6094 | 0.8621    | 0.9891              | 0.9702 | 0.9464 | 0.5906   | 0.5943    | 0.9437  | 0.9936 | 0.9369       | 0.8300      | 0.9157        | 0.8207        | 0.7405     |
| 0.1081        | 3.0   | 6525  | 0.1143          | 0.8376            | 0.8825         | 0.8595     | 0.9559           | 0.9865         | 0.9803           | 0.8586 | 0.7828    | 0.8154 | 0.8297            | 0.7920            | 0.7957  | 0.9143         | 0.8413    | 0.8218           | 0.9628              | 0.8634              | 0.6290      | 0.5636          | 0.1324          | 0.8788            | 0.8283  | 0.4895 | 0.9797   | 0.9917             | 0.8895      | 0.8303       | 0.9294    | 0.9718    | 0.9746  | 0.0    | 0.8325  | 0.7976  | 0.7521     | 0.9647      | 0.9140     | 0.7495      | 0.7371             | 0.9848 | 0.7944          | 0.8836        | 0.9955                 | 0.9701              | 0.9227      | 0.9944       | 0.9785         | 0.7427 | 0.9254    | 0.9924              | 0.9701 | 0.9527 | 0.8031   | 0.7519    | 0.9288  | 0.9929 | 0.9848       | 0.8880      | 0.9391        | 0.9251        | 0.8403     |
| 0.0804        | 4.0   | 8700  | 0.0962          | 0.8739            | 0.9046         | 0.8890     | 0.9623           | 0.9898         | 0.9896           | 0.8745 | 0.8663    | 0.8782 | 0.9414            | 0.8279            | 0.8312  | 0.9434         | 0.9279    | 0.8947           | 0.9755              | 0.8770              | 0.6753      | 0.6398          | 0.2105          | 0.9223            | 0.8276  | 0.5470 | 0.9840   | 0.9972             | 0.9027      | 0.8696       | 0.9627    | 0.9811    | 0.9912  | 0.0124 | 0.8377  | 0.7585  | 0.8212     | 0.9833      | 0.9110     | 0.8305      | 0.8793             | 0.9957 | 0.8315          | 0.9441        | 0.9970                 | 0.9682              | 0.9654      | 0.9944       | 0.9860         | 0.8150 | 0.9306    | 0.9935              | 0.9721 | 0.9759 | 0.8817   | 0.8264    | 0.9485  | 0.9936 | 0.9976       | 0.9108      | 0.9568        | 0.9239        | 0.8543     |
| 0.0663        | 5.0   | 10875 | 0.0965          | 0.8761            | 0.9089         | 0.8922     | 0.9632           | 0.9882         | 0.9896           | 0.8845 | 0.8676    | 0.8750 | 0.9463            | 0.8280            | 0.8415  | 0.9482         | 0.9365    | 0.8954           | 0.9774              | 0.8897              | 0.6571      | 0.6773          | 0.2690          | 0.9217            | 0.8259  | 0.6135 | 0.9859   | 0.9972             | 0.9180      | 0.8840       | 0.9708    | 0.975     | 0.9667  | 0.1201 | 0.8109  | 0.7064  | 0.8298     | 0.9885      | 0.9265     | 0.8520      | 0.8844             | 0.9935 | 0.8523          | 0.9462        | 0.9985                 | 0.9744              | 0.9682      | 0.9958       | 0.9881         | 0.8166 | 0.9333    | 0.9935              | 0.9721 | 0.9772 | 0.8889   | 0.8294    | 0.9632  | 0.9952 | 0.9976       | 0.9172      | 0.9538        | 0.9418        | 0.8638     |


### Framework versions

- Transformers 4.26.1
- Pytorch 2.0.0.post200
- Datasets 2.10.1
- Tokenizers 0.13.3