---
license: mit
base_model: FacebookAI/xlm-roberta-large
tags:
- generated_from_trainer
model-index:
- name: roberta-large-ner-ghtk-cs-6-label-new-data-3090-14Sep-1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-large-ner-ghtk-cs-6-label-new-data-3090-14Sep-1

This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1772
- Tk: {'precision': 0.7345132743362832, 'recall': 0.7155172413793104, 'f1': 0.7248908296943231, 'number': 116}
- Gày: {'precision': 0.7380952380952381, 'recall': 0.9117647058823529, 'f1': 0.8157894736842106, 'number': 34}
- Gày trừu tượng: {'precision': 0.9118852459016393, 'recall': 0.9118852459016393, 'f1': 0.9118852459016393, 'number': 488}
- Ã đơn: {'precision': 0.8514851485148515, 'recall': 0.8472906403940886, 'f1': 0.8493827160493828, 'number': 203}
- Đt: {'precision': 0.9291084854994629, 'recall': 0.9851936218678815, 'f1': 0.9563294637921502, 'number': 878}
- Đt trừu tượng: {'precision': 0.8259109311740891, 'recall': 0.8755364806866953, 'f1': 0.85, 'number': 233}
- Overall Precision: 0.8898
- Overall Recall: 0.9221
- Overall F1: 0.9057
- Overall Accuracy: 0.9665
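
As a sanity check on the numbers above, each per-label F1 is the harmonic mean of its precision and recall, and the overall scores follow the same formula. A minimal sketch (standard seqeval-style definitions, not code from this repository):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Reproduce the reported overall F1 from overall precision/recall.
overall_f1 = f1_score(0.8898, 0.9221)  # rounds to the reported 0.9057
```

The same check can be applied to any per-label entry, e.g. the `Đt` row: `f1_score(0.9291084854994629, 0.9851936218678815)` recovers the reported 0.9563.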

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
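
With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from 2.5e-05 to 0 over the run (4670 steps at 467 steps/epoch, per the results table below). A hedged sketch of that schedule; the `warmup_steps` parameter is an assumption for generality, not taken from this card:

```python
def linear_lr(step: int, base_lr: float = 2.5e-5,
              total_steps: int = 4670, warmup_steps: int = 0) -> float:
    """Linear warmup (assumed 0 here) then linear decay to zero,
    mirroring the Transformers 'linear' scheduler's shape."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, float(total_steps - step))
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Midway through training (epoch 5, step 2335) the LR is half the base rate.
lr_mid = linear_lr(2335)  # 1.25e-05
```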

### Training results

| Training Loss | Epoch | Step | Validation Loss | Tk                                                                                                       | Gày                                                                                                     | Gày trừu tượng                                                                                           | Ã đơn                                                                                                    | Đt                                                                                                       | Đt trừu tượng                                                                                            | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| No log        | 1.0   | 467  | 0.1542          | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 116}                                              | {'precision': 0.5384615384615384, 'recall': 0.8235294117647058, 'f1': 0.6511627906976744, 'number': 34} | {'precision': 0.7964285714285714, 'recall': 0.9139344262295082, 'f1': 0.8511450381679388, 'number': 488} | {'precision': 0.8045977011494253, 'recall': 0.6896551724137931, 'f1': 0.7427055702917771, 'number': 203} | {'precision': 0.8341323106423778, 'recall': 0.9908883826879271, 'f1': 0.9057782404997397, 'number': 878} | {'precision': 0.7639484978540773, 'recall': 0.7639484978540773, 'f1': 0.7639484978540771, 'number': 233} | 0.8021            | 0.8514         | 0.8260     | 0.9463           |
| 0.263         | 2.0   | 934  | 0.1126          | {'precision': 0.6966292134831461, 'recall': 0.5344827586206896, 'f1': 0.6048780487804878, 'number': 116} | {'precision': 0.6041666666666666, 'recall': 0.8529411764705882, 'f1': 0.7073170731707317, 'number': 34} | {'precision': 0.8535645472061657, 'recall': 0.9077868852459017, 'f1': 0.8798411122144986, 'number': 488} | {'precision': 0.9135802469135802, 'recall': 0.729064039408867, 'f1': 0.810958904109589, 'number': 203}   | {'precision': 0.9209694415173867, 'recall': 0.9954441913439636, 'f1': 0.9567597153804049, 'number': 878} | {'precision': 0.7540983606557377, 'recall': 0.7896995708154506, 'f1': 0.7714884696016772, 'number': 233} | 0.8652            | 0.8914         | 0.8781     | 0.9574           |
| 0.1028        | 3.0   | 1401 | 0.1177          | {'precision': 0.868421052631579, 'recall': 0.5689655172413793, 'f1': 0.6875000000000001, 'number': 116}  | {'precision': 0.6122448979591837, 'recall': 0.8823529411764706, 'f1': 0.7228915662650602, 'number': 34} | {'precision': 0.8669275929549902, 'recall': 0.9077868852459017, 'f1': 0.8868868868868869, 'number': 488} | {'precision': 0.8507462686567164, 'recall': 0.8423645320197044, 'f1': 0.8465346534653465, 'number': 203} | {'precision': 0.9059561128526645, 'recall': 0.9874715261958997, 'f1': 0.9449591280653951, 'number': 878} | {'precision': 0.8034188034188035, 'recall': 0.8068669527896996, 'f1': 0.8051391862955032, 'number': 233} | 0.8703            | 0.9042         | 0.8869     | 0.9600           |
| 0.077         | 4.0   | 1868 | 0.1431          | {'precision': 0.8666666666666667, 'recall': 0.33620689655172414, 'f1': 0.484472049689441, 'number': 116} | {'precision': 0.7435897435897436, 'recall': 0.8529411764705882, 'f1': 0.7945205479452054, 'number': 34} | {'precision': 0.87109375, 'recall': 0.9139344262295082, 'f1': 0.892, 'number': 488}                      | {'precision': 0.8972972972972973, 'recall': 0.8177339901477833, 'f1': 0.8556701030927836, 'number': 203} | {'precision': 0.8969072164948454, 'recall': 0.9908883826879271, 'f1': 0.9415584415584417, 'number': 878} | {'precision': 0.6175637393767706, 'recall': 0.9356223175965666, 'f1': 0.7440273037542662, 'number': 233} | 0.8403            | 0.9057         | 0.8718     | 0.9591           |
| 0.053         | 5.0   | 2335 | 0.1367          | {'precision': 0.7065217391304348, 'recall': 0.5603448275862069, 'f1': 0.625, 'number': 116}              | {'precision': 0.8181818181818182, 'recall': 0.7941176470588235, 'f1': 0.8059701492537314, 'number': 34} | {'precision': 0.8993963782696177, 'recall': 0.9159836065573771, 'f1': 0.9076142131979696, 'number': 488} | {'precision': 0.8440860215053764, 'recall': 0.7733990147783252, 'f1': 0.8071979434447302, 'number': 203} | {'precision': 0.9205508474576272, 'recall': 0.989749430523918, 'f1': 0.9538968166849615, 'number': 878}  | {'precision': 0.7114093959731543, 'recall': 0.9098712446351931, 'f1': 0.7984934086629002, 'number': 233} | 0.8668            | 0.9103         | 0.8881     | 0.9625           |
| 0.0404        | 6.0   | 2802 | 0.1269          | {'precision': 0.7959183673469388, 'recall': 0.6724137931034483, 'f1': 0.7289719626168225, 'number': 116} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.9168399168399168, 'recall': 0.9036885245901639, 'f1': 0.9102167182662538, 'number': 488} | {'precision': 0.8967391304347826, 'recall': 0.812807881773399, 'f1': 0.8527131782945736, 'number': 203}  | {'precision': 0.9555555555555556, 'recall': 0.979498861047836, 'f1': 0.9673790776152982, 'number': 878}  | {'precision': 0.7619047619047619, 'recall': 0.8927038626609443, 'f1': 0.8221343873517787, 'number': 233} | 0.9010            | 0.9134         | 0.9071     | 0.9664           |
| 0.0232        | 7.0   | 3269 | 0.1361          | {'precision': 0.7818181818181819, 'recall': 0.7413793103448276, 'f1': 0.7610619469026548, 'number': 116} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.896, 'recall': 0.9180327868852459, 'f1': 0.9068825910931175, 'number': 488}              | {'precision': 0.875, 'recall': 0.8620689655172413, 'f1': 0.8684863523573201, 'number': 203}              | {'precision': 0.9505494505494505, 'recall': 0.9851936218678815, 'f1': 0.9675615212527963, 'number': 878} | {'precision': 0.8504273504273504, 'recall': 0.8540772532188842, 'f1': 0.8522483940042828, 'number': 233} | 0.9034            | 0.9242         | 0.9136     | 0.9693           |
| 0.0192        | 8.0   | 3736 | 0.1610          | {'precision': 0.7478991596638656, 'recall': 0.7672413793103449, 'f1': 0.7574468085106383, 'number': 116} | {'precision': 0.7435897435897436, 'recall': 0.8529411764705882, 'f1': 0.7945205479452054, 'number': 34} | {'precision': 0.9102040816326531, 'recall': 0.9139344262295082, 'f1': 0.9120654396728016, 'number': 488} | {'precision': 0.8442211055276382, 'recall': 0.8275862068965517, 'f1': 0.8358208955223881, 'number': 203} | {'precision': 0.9361471861471862, 'recall': 0.9851936218678815, 'f1': 0.9600443951165373, 'number': 878} | {'precision': 0.8326359832635983, 'recall': 0.8540772532188842, 'f1': 0.8432203389830507, 'number': 233} | 0.8935            | 0.9201         | 0.9066     | 0.9661           |
| 0.01          | 9.0   | 4203 | 0.1725          | {'precision': 0.7368421052631579, 'recall': 0.603448275862069, 'f1': 0.6635071090047393, 'number': 116}  | {'precision': 0.7045454545454546, 'recall': 0.9117647058823529, 'f1': 0.794871794871795, 'number': 34}  | {'precision': 0.9087221095334685, 'recall': 0.9180327868852459, 'f1': 0.9133537206931702, 'number': 488} | {'precision': 0.8613861386138614, 'recall': 0.8571428571428571, 'f1': 0.8592592592592593, 'number': 203} | {'precision': 0.9261241970021413, 'recall': 0.9851936218678815, 'f1': 0.9547461368653422, 'number': 878} | {'precision': 0.8326359832635983, 'recall': 0.8540772532188842, 'f1': 0.8432203389830507, 'number': 233} | 0.8904            | 0.9155         | 0.9028     | 0.9658           |
| 0.0082        | 10.0  | 4670 | 0.1772          | {'precision': 0.7345132743362832, 'recall': 0.7155172413793104, 'f1': 0.7248908296943231, 'number': 116} | {'precision': 0.7380952380952381, 'recall': 0.9117647058823529, 'f1': 0.8157894736842106, 'number': 34} | {'precision': 0.9118852459016393, 'recall': 0.9118852459016393, 'f1': 0.9118852459016393, 'number': 488} | {'precision': 0.8514851485148515, 'recall': 0.8472906403940886, 'f1': 0.8493827160493828, 'number': 203} | {'precision': 0.9291084854994629, 'recall': 0.9851936218678815, 'f1': 0.9563294637921502, 'number': 878} | {'precision': 0.8259109311740891, 'recall': 0.8755364806866953, 'f1': 0.85, 'number': 233}               | 0.8898            | 0.9221         | 0.9057     | 0.9665           |


### Framework versions

- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1