---
license: mit
base_model: prajjwal1/bert-tiny
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: MM03-PC
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MM03-PC

This model is a fine-tuned version of [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on an unknown dataset.
It achieves the following results on the evaluation set (see the usage sketch after the list):
- Loss: 0.5909
- Accuracy: 0.71
- F1: 0.8304
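
Assuming this checkpoint is a sequence classifier (the accuracy/F1 metrics and the `bert-tiny` base suggest as much), a minimal inference sketch is shown below; the model id and example text are placeholders, not taken from the card.

```python
# Minimal inference sketch. Assumes a sequence-classification head;
# the model id is a placeholder -- substitute the actual Hub id
# (e.g. "<user>/MM03-PC") or a local checkpoint path.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MM03-PC"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input text.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())               # predicted class id
print(logits.softmax(dim=-1).squeeze().tolist())  # class probabilities
```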

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
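
A sketch of `TrainingArguments` mirroring these values, assuming the standard Hugging Face `Trainer` (which generated this card); `output_dir` and the eval/logging cadence are inferred from the results table below, not stated explicitly.

```python
# TrainingArguments mirroring the hyperparameters listed above.
# output_dir is assumed; eval every 50 steps and training-loss logging
# every 500 steps are inferred from the results table below.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="MM03-PC",         # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # inferred from the 50-step eval cadence
    eval_steps=50,
    logging_steps=500,            # inferred: training loss appears every 500 steps
)
```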

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.00  | 50   | 0.6916          | 0.53     | 0.3672 |
| No log        | 0.01  | 100  | 0.6924          | 0.53     | 0.3672 |
| No log        | 0.01  | 150  | 0.6922          | 0.53     | 0.3672 |
| No log        | 0.01  | 200  | 0.6927          | 0.56     | 0.5593 |
| No log        | 0.02  | 250  | 0.6903          | 0.53     | 0.3672 |
| No log        | 0.02  | 300  | 0.6884          | 0.53     | 0.3672 |
| No log        | 0.03  | 350  | 0.6875          | 0.53     | 0.3842 |
| No log        | 0.03  | 400  | 0.6865          | 0.59     | 0.5100 |
| No log        | 0.03  | 450  | 0.6835          | 0.59     | 0.5193 |
| 0.6925        | 0.04  | 500  | 0.6792          | 0.58     | 0.5732 |
| 0.6925        | 0.04  | 550  | 0.6717          | 0.74     | 0.7324 |
| 0.6925        | 0.04  | 600  | 0.6558          | 0.73     | 0.7248 |
| 0.6925        | 0.05  | 650  | 0.6456          | 0.65     | 0.6286 |
| 0.6925        | 0.05  | 700  | 0.6371          | 0.74     | 0.7342 |
| 0.6925        | 0.06  | 750  | 0.6353          | 0.64     | 0.6403 |
| 0.6925        | 0.06  | 800  | 0.6331          | 0.72     | 0.7096 |
| 0.6925        | 0.06  | 850  | 0.6298          | 0.73     | 0.7248 |
| 0.6925        | 0.07  | 900  | 0.6341          | 0.69     | 0.6743 |
| 0.6925        | 0.07  | 950  | 0.6302          | 0.61     | 0.6102 |
| 0.6691        | 0.07  | 1000 | 0.6161          | 0.63     | 0.6297 |
| 0.6691        | 0.08  | 1050 | 0.6035          | 0.75     | 0.7486 |
| 0.6691        | 0.08  | 1100 | 0.6015          | 0.74     | 0.7370 |
| 0.6691        | 0.08  | 1150 | 0.5958          | 0.73     | 0.7298 |
| 0.6691        | 0.09  | 1200 | 0.5895          | 0.73     | 0.7263 |
| 0.6691        | 0.09  | 1250 | 0.5921          | 0.73     | 0.7263 |
| 0.6691        | 0.10  | 1300 | 0.5935          | 0.73     | 0.7285 |
| 0.6691        | 0.10  | 1350 | 0.5853          | 0.73     | 0.7275 |
| 0.6691        | 0.10  | 1400 | 0.5952          | 0.74     | 0.7381 |
| 0.6691        | 0.11  | 1450 | 0.5811          | 0.76     | 0.7582 |
| 0.6482        | 0.11  | 1500 | 0.5849          | 0.70     | 0.6933 |
| 0.6482        | 0.11  | 1550 | 0.5827          | 0.71     | 0.7044 |
| 0.6482        | 0.12  | 1600 | 0.5741          | 0.71     | 0.7026 |
| 0.6482        | 0.12  | 1650 | 0.5782          | 0.73     | 0.7275 |
| 0.6482        | 0.12  | 1700 | 0.5704          | 0.74     | 0.7370 |
| 0.6482        | 0.13  | 1750 | 0.5704          | 0.74     | 0.7396 |
| 0.6482        | 0.13  | 1800 | 0.5592          | 0.72     | 0.7154 |
| 0.6482        | 0.14  | 1850 | 0.5661          | 0.72     | 0.7137 |
| 0.6482        | 0.14  | 1900 | 0.5762          | 0.71     | 0.7044 |
| 0.6482        | 0.14  | 1950 | 0.5702          | 0.71     | 0.7044 |
| 0.6226        | 0.15  | 2000 | 0.5677          | 0.73     | 0.7285 |
| 0.6226        | 0.15  | 2050 | 0.5649          | 0.73     | 0.7285 |
| 0.6226        | 0.15  | 2100 | 0.5583          | 0.74     | 0.7370 |
| 0.6226        | 0.16  | 2150 | 0.5712          | 0.70     | 0.6951 |
| 0.6226        | 0.16  | 2200 | 0.5661          | 0.70     | 0.6951 |
| 0.6226        | 0.17  | 2250 | 0.5452          | 0.76     | 0.7573 |
| 0.6226        | 0.17  | 2300 | 0.5448          | 0.75     | 0.7493 |
| 0.6226        | 0.17  | 2350 | 0.5424          | 0.75     | 0.7493 |
| 0.6226        | 0.18  | 2400 | 0.5444          | 0.75     | 0.7477 |
| 0.6226        | 0.18  | 2450 | 0.5400          | 0.75     | 0.7477 |
| 0.6058        | 0.18  | 2500 | 0.5393          | 0.75     | 0.7493 |
| 0.6058        | 0.19  | 2550 | 0.5495          | 0.75     | 0.7486 |
| 0.6058        | 0.19  | 2600 | 0.5309          | 0.76     | 0.7590 |
| 0.6058        | 0.19  | 2650 | 0.5242          | 0.73     | 0.7298 |
| 0.6058        | 0.2   | 2700 | 0.5239          | 0.73     | 0.7298 |
| 0.6058        | 0.2   | 2750 | 0.5201          | 0.71     | 0.7098 |
| 0.6058        | 0.21  | 2800 | 0.5087          | 0.73     | 0.7285 |
| 0.6058        | 0.21  | 2850 | 0.5041          | 0.75     | 0.7486 |
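
The accuracy and F1 columns above are consistent with a `compute_metrics` callback along these lines; a minimal sketch, assuming the `evaluate` library and binary labels (the card does not say how the metrics were actually computed).

```python
# Minimal compute_metrics sketch consistent with the accuracy/F1 columns
# above. Assumes the `evaluate` library and binary labels; this is an
# illustration, not the card's confirmed setup.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels)["accuracy"],
        "f1": f1.compute(predictions=preds, references=labels)["f1"],
    }
```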


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0