---
tags:
- generated_from_trainer
datasets:
- generator
model-index:
- name: scideberta-cs-tdm-pretrained-finetuned-ner
  results: []
---


# scideberta-cs-tdm-pretrained-finetuned-ner

This model was fine-tuned for named-entity recognition (NER) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8293
- Overall Precision: 0.6327
- Overall Recall: 0.7460
- Overall F1: 0.6847
- Overall Accuracy: 0.9608
- Datasetname F1: 0.6968
- Hyperparametername F1: 0.6765
- Hyperparametervalue F1: 0.7289
- Methodname F1: 0.7290
- Metricname F1: 0.5269
- Metricvalue F1: 0.8235
- Taskname F1: 0.6099
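
The overall F1 above is the harmonic mean of overall precision and recall, which can be verified directly from the reported numbers:

```python
# Overall F1 is the harmonic mean of overall precision and recall.
precision = 0.6327
recall = 0.7460

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.6847, matching the reported Overall F1
```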

## Model description

A token-classification (NER) model for scientific text that tags seven entity types, reflected in the per-type F1 scores above: dataset names, hyperparameter names and values, method names, metric names and values, and task names.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
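
With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 to 0 over the total scheduled steps (131 steps per epoch × 100 epochs in this run, judging by the step counts below). A minimal sketch of that schedule, assuming zero warmup:

```python
# Linear decay (no warmup), as the Hugging Face Trainer applies when
# lr_scheduler_type is "linear": the learning rate falls from its initial
# value to 0 over the total number of scheduled training steps.
def linear_lr(step, initial_lr=2e-5, total_steps=131 * 100):
    """Learning rate at a given optimizer step (131 steps/epoch x 100 epochs)."""
    return initial_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))         # initial learning rate, 2e-05
print(linear_lr(131 * 50))  # halfway through: 1e-05
print(linear_lr(131 * 100)) # end of schedule: 0.0
```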

### Training results

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Datasetname F1 | Hyperparametername F1 | Hyperparametervalue F1 | Methodname F1 | Metricname F1 | Metricvalue F1 | Taskname F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|:--------------:|:---------------------:|:----------------------:|:-------------:|:-------------:|:--------------:|:-----------:|
| No log        | 1.0   | 131  | 0.4448          | 0.4113            | 0.6147         | 0.4929     | 0.9353           | 0.5312         | 0.3736                | 0.4818                 | 0.6256        | 0.4667        | 0.2456         | 0.4526      |
| No log        | 2.0   | 262  | 0.3527          | 0.4341            | 0.7067         | 0.5378     | 0.9416           | 0.5347         | 0.4549                | 0.5487                 | 0.6256        | 0.5026        | 0.7200         | 0.4593      |
| No log        | 3.0   | 393  | 0.4857          | 0.5794            | 0.6491         | 0.6123     | 0.9544           | 0.6420         | 0.5263                | 0.6011                 | 0.7030        | 0.5276        | 0.7838         | 0.5385      |
| 0.3806        | 4.0   | 524  | 0.3789          | 0.4923            | 0.7485         | 0.5940     | 0.9492           | 0.6358         | 0.5418                | 0.6165                 | 0.6166        | 0.5227        | 0.7826         | 0.5690      |
| 0.3806        | 5.0   | 655  | 0.4563          | 0.5736            | 0.7313         | 0.6429     | 0.9568           | 0.6298         | 0.6176                | 0.7143                 | 0.6824        | 0.5402        | 0.8090         | 0.5463      |
| 0.3806        | 6.0   | 786  | 0.4021          | 0.5199            | 0.7215         | 0.6043     | 0.9525           | 0.6581         | 0.5848                | 0.5603                 | 0.6431        | 0.4973        | 0.7579         | 0.5738      |
| 0.3806        | 7.0   | 917  | 0.4851          | 0.5614            | 0.7460         | 0.6407     | 0.9565           | 0.6506         | 0.6199                | 0.6888                 | 0.6982        | 0.4787        | 0.7826         | 0.5571      |
| 0.0724        | 8.0   | 1048 | 0.5002          | 0.5890            | 0.7350         | 0.6539     | 0.9583           | 0.6316         | 0.6150                | 0.7273                 | 0.7098        | 0.5357        | 0.8140         | 0.5636      |
| 0.0724        | 9.0   | 1179 | 0.5948          | 0.6036            | 0.7325         | 0.6619     | 0.9589           | 0.6839         | 0.6408                | 0.6991                 | 0.7165        | 0.4918        | 0.7692         | 0.6140      |
| 0.0724        | 10.0  | 1310 | 0.5070          | 0.5716            | 0.7497         | 0.6486     | 0.9566           | 0.6582         | 0.6164                | 0.6812                 | 0.6949        | 0.5371        | 0.7692         | 0.5929      |
| 0.0724        | 11.0  | 1441 | 0.6557          | 0.6339            | 0.7350         | 0.6807     | 0.9614           | 0.6883         | 0.6650                | 0.7373                 | 0.7364        | 0.5143        | 0.8293         | 0.5956      |
| 0.0285        | 12.0  | 1572 | 0.5910          | 0.5713            | 0.7374         | 0.6438     | 0.9574           | 0.6835         | 0.6150                | 0.6754                 | 0.7099        | 0.5114        | 0.6792         | 0.5763      |
| 0.0285        | 13.0  | 1703 | 0.6679          | 0.6188            | 0.7350         | 0.6719     | 0.9607           | 0.6928         | 0.6539                | 0.7232                 | 0.7280        | 0.5000        | 0.8333         | 0.5728      |
| 0.0285        | 14.0  | 1834 | 0.6856          | 0.6246            | 0.7227         | 0.6701     | 0.9612           | 0.6579         | 0.6256                | 0.7123                 | 0.7452        | 0.5128        | 0.8148         | 0.6018      |
| 0.0285        | 15.0  | 1965 | 0.7225          | 0.6238            | 0.7387         | 0.6764     | 0.9606           | 0.6962         | 0.6586                | 0.7117                 | 0.7290        | 0.4878        | 0.8095         | 0.6283      |
| 0.0154        | 16.0  | 2096 | 0.7242          | 0.5980            | 0.7301         | 0.6575     | 0.9591           | 0.6752         | 0.6293                | 0.6987                 | 0.7148        | 0.5030        | 0.8193         | 0.5714      |
| 0.0154        | 17.0  | 2227 | 0.7268          | 0.6282            | 0.7276         | 0.6742     | 0.9606           | 0.7006         | 0.6568                | 0.7059                 | 0.7255        | 0.5269        | 0.8148         | 0.5963      |
| 0.0154        | 18.0  | 2358 | 0.7498          | 0.6233            | 0.7411         | 0.6771     | 0.9606           | 0.6962         | 0.6402                | 0.7321                 | 0.7280        | 0.5422        | 0.8434         | 0.5899      |
| 0.0154        | 19.0  | 2489 | 0.7161          | 0.6202            | 0.7534         | 0.6803     | 0.9595           | 0.7051         | 0.6479                | 0.7085                 | 0.7524        | 0.5269        | 0.8148         | 0.5919      |
| 0.0104        | 20.0  | 2620 | 0.7926          | 0.6315            | 0.7129         | 0.6697     | 0.9615           | 0.6797         | 0.6502                | 0.7027                 | 0.7269        | 0.5357        | 0.7949         | 0.5905      |
| 0.0104        | 21.0  | 2751 | 0.7827          | 0.6464            | 0.7423         | 0.6910     | 0.9626           | 0.7190         | 0.6751                | 0.7123                 | 0.7395        | 0.5562        | 0.8205         | 0.6197      |
| 0.0104        | 22.0  | 2882 | 0.7285          | 0.6300            | 0.7521         | 0.6857     | 0.9599           | 0.7097         | 0.6782                | 0.7207                 | 0.7215        | 0.5333        | 0.8333         | 0.6188      |
| 0.0049        | 23.0  | 3013 | 0.7645          | 0.6413            | 0.7350         | 0.6850     | 0.9620           | 0.6968         | 0.6717                | 0.7182                 | 0.7301        | 0.5476        | 0.8395         | 0.6066      |
| 0.0049        | 24.0  | 3144 | 0.8071          | 0.6466            | 0.7387         | 0.6896     | 0.9616           | 0.7105         | 0.6886                | 0.7189                 | 0.7362        | 0.5535        | 0.7750         | 0.6019      |
| 0.0049        | 25.0  | 3275 | 0.8324          | 0.6319            | 0.7350         | 0.6795     | 0.9611           | 0.7059         | 0.6683                | 0.6964                 | 0.7280        | 0.5366        | 0.8193         | 0.6063      |
| 0.0049        | 26.0  | 3406 | 0.8235          | 0.6355            | 0.7337         | 0.6811     | 0.9606           | 0.6928         | 0.6700                | 0.7189                 | 0.7328        | 0.5610        | 0.8250         | 0.5674      |
| 0.004         | 27.0  | 3537 | 0.8106          | 0.6220            | 0.7411         | 0.6764     | 0.9602           | 0.7089         | 0.6536                | 0.7000                 | 0.7495        | 0.5089        | 0.8500         | 0.5611      |
| 0.004         | 28.0  | 3668 | 0.8271          | 0.6353            | 0.7460         | 0.6862     | 0.9611           | 0.7013         | 0.6634                | 0.7054                 | 0.7457        | 0.5644        | 0.8293         | 0.5936      |
| 0.004         | 29.0  | 3799 | 0.8630          | 0.6400            | 0.7374         | 0.6853     | 0.9613           | 0.6923         | 0.6634                | 0.7189                 | 0.7348        | 0.5783        | 0.8537         | 0.5888      |
| 0.004         | 30.0  | 3930 | 0.8055          | 0.6163            | 0.7411         | 0.6730     | 0.9598           | 0.7226         | 0.6522                | 0.7074                 | 0.7063        | 0.5176        | 0.8537         | 0.6161      |
| 0.0029        | 31.0  | 4061 | 0.8293          | 0.6327            | 0.7460         | 0.6847     | 0.9608           | 0.6968         | 0.6765                | 0.7289                 | 0.7290        | 0.5269        | 0.8235         | 0.6099      |
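
The metrics at the top of this card match the final logged epoch (31), not the best one: scanning the Overall F1 column shows epoch 21 scored highest (0.6910). A small sketch of that check over the values above:

```python
# Overall F1 per epoch, copied from the validation columns above (epochs 1-31).
overall_f1 = [
    0.4929, 0.5378, 0.6123, 0.5940, 0.6429, 0.6043, 0.6407, 0.6539,
    0.6619, 0.6486, 0.6807, 0.6438, 0.6719, 0.6701, 0.6764, 0.6575,
    0.6742, 0.6771, 0.6803, 0.6697, 0.6910, 0.6857, 0.6850, 0.6896,
    0.6795, 0.6811, 0.6764, 0.6862, 0.6853, 0.6730, 0.6847,
]

# Epochs are 1-indexed, so add 1 to the argmax position.
best_epoch = max(range(len(overall_f1)), key=overall_f1.__getitem__) + 1
print(best_epoch, overall_f1[best_epoch - 1])  # 21 0.691
```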


### Framework versions

- Transformers 4.23.1
- Pytorch 1.12.1+cu102
- Datasets 2.6.1
- Tokenizers 0.13.1