---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
base_model: facebook/bart-base
model-index:
- name: bart-base-lora
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-base-lora

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6655
- Accuracy: 0.7963
- Precision: 0.7841
- Recall: 0.7963
- Precision Macro: 0.5968
- Recall Macro: 0.6325
- Macro Fpr: 0.0186
- Weighted Fpr: 0.0179
- Weighted Specificity: 0.9749
- Macro Specificity: 0.9847
- Weighted Sensitivity: 0.7963
- Macro Sensitivity: 0.6325
- F1 Micro: 0.7963
- F1 Macro: 0.6074
- F1 Weighted: 0.7859
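
As a hedged usage sketch (the card does not specify the task or include a usage snippet): the accuracy/precision/recall metrics above suggest a multi-class classification setup, and the model name suggests a LoRA/PEFT adapter on top of `facebook/bart-base`. The repository id and label count below are placeholders.

```python
# Hedged sketch: assumes this repo hosts a PEFT/LoRA adapter on top of
# facebook/bart-base for sequence classification. The repo id and
# num_labels are placeholders, not values confirmed by this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base = AutoModelForSequenceClassification.from_pretrained(
    "facebook/bart-base",
    num_labels=2,  # placeholder: set to the dataset's actual label count
)
model = PeftModel.from_pretrained(base, "your-username/bart-base-lora")  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")

inputs = tokenizer("Example input text", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
```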

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
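
For reference, a minimal sketch of a `TrainingArguments` configuration mirroring the values above; the output directory and any settings not listed (e.g. weight decay, warmup) are assumptions.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bart-base-lora",      # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,    # effective train batch size: 8 * 4 = 32
    num_train_epochs=15,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```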

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 160  | 1.2642          | 0.6313   | 0.5477    | 0.6313 | 0.3009          | 0.3127       | 0.0428    | 0.0400       | 0.9351               | 0.9711            | 0.6313               | 0.3127            | 0.6313   | 0.2941   | 0.5769      |
| No log        | 2.0   | 321  | 0.8962          | 0.7119   | 0.6939    | 0.7119 | 0.3937          | 0.4525       | 0.0285    | 0.0281       | 0.9669               | 0.9786            | 0.7119               | 0.4525            | 0.7119   | 0.4107   | 0.6960      |
| No log        | 3.0   | 482  | 0.8204          | 0.7196   | 0.6953    | 0.7196 | 0.3974          | 0.4468       | 0.0278    | 0.0271       | 0.9653               | 0.9790            | 0.7196               | 0.4468            | 0.7196   | 0.3998   | 0.6885      |
| 1.2731        | 4.0   | 643  | 0.7519          | 0.7436   | 0.7186    | 0.7436 | 0.4131          | 0.4673       | 0.0244    | 0.0240       | 0.9695               | 0.9809            | 0.7436               | 0.4673            | 0.7436   | 0.4272   | 0.7248      |
| 1.2731        | 5.0   | 803  | 0.7364          | 0.7475   | 0.7524    | 0.7475 | 0.6132          | 0.5050       | 0.0243    | 0.0236       | 0.9679               | 0.9810            | 0.7475               | 0.5050            | 0.7475   | 0.4905   | 0.7286      |
| 1.2731        | 6.0   | 964  | 0.7273          | 0.7514   | 0.7423    | 0.7514 | 0.5784          | 0.5258       | 0.0237    | 0.0231       | 0.9699               | 0.9814            | 0.7514               | 0.5258            | 0.7514   | 0.5150   | 0.7311      |
| 0.7243        | 7.0   | 1125 | 0.6993          | 0.7645   | 0.7478    | 0.7645 | 0.5498          | 0.5565       | 0.0222    | 0.0215       | 0.9721               | 0.9824            | 0.7645               | 0.5565            | 0.7645   | 0.5453   | 0.7538      |
| 0.7243        | 8.0   | 1286 | 0.6952          | 0.7769   | 0.7639    | 0.7769 | 0.5682          | 0.5888       | 0.0207    | 0.0201       | 0.9731               | 0.9833            | 0.7769               | 0.5888            | 0.7769   | 0.5700   | 0.7649      |
| 0.7243        | 9.0   | 1446 | 0.6759          | 0.7823   | 0.7708    | 0.7823 | 0.5764          | 0.5877       | 0.0201    | 0.0195       | 0.9739               | 0.9838            | 0.7823               | 0.5877            | 0.7823   | 0.5699   | 0.7697      |
| 0.6098        | 10.0  | 1607 | 0.6705          | 0.7847   | 0.7720    | 0.7847 | 0.5899          | 0.6176       | 0.0199    | 0.0192       | 0.9732               | 0.9839            | 0.7847               | 0.6176            | 0.7847   | 0.5935   | 0.7724      |
| 0.6098        | 11.0  | 1768 | 0.6794          | 0.7909   | 0.7737    | 0.7909 | 0.5882          | 0.6237       | 0.0193    | 0.0185       | 0.9736               | 0.9843            | 0.7909               | 0.6237            | 0.7909   | 0.5988   | 0.7773      |
| 0.6098        | 12.0  | 1929 | 0.6836          | 0.7909   | 0.7816    | 0.7909 | 0.5973          | 0.6285       | 0.0192    | 0.0185       | 0.9742               | 0.9843            | 0.7909               | 0.6285            | 0.7909   | 0.6034   | 0.7802      |
| 0.5239        | 13.0  | 2089 | 0.6508          | 0.7932   | 0.7783    | 0.7932 | 0.5965          | 0.6273       | 0.0189    | 0.0183       | 0.9738               | 0.9845            | 0.7932               | 0.6273            | 0.7932   | 0.6046   | 0.7821      |
| 0.5239        | 14.0  | 2250 | 0.6588          | 0.7963   | 0.7823    | 0.7963 | 0.5957          | 0.6290       | 0.0186    | 0.0179       | 0.9746               | 0.9847            | 0.7963               | 0.6290            | 0.7963   | 0.6055   | 0.7852      |
| 0.5239        | 14.93 | 2400 | 0.6655          | 0.7963   | 0.7841    | 0.7963 | 0.5968          | 0.6325       | 0.0186    | 0.0179       | 0.9749               | 0.9847            | 0.7963               | 0.6325            | 0.7963   | 0.6074   | 0.7859      |
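
The table reports both weighted metrics (Precision/Recall/F1 Weighted, which average per-class scores by class frequency) and macro metrics (which average all classes equally, so rare classes count as much as common ones). A minimal sketch of how such values are typically computed with scikit-learn follows; the actual `compute_metrics` function used during training is not included in this card.

```python
# Hedged sketch: how the weighted vs. macro metrics in the table are
# typically computed. Not the exact compute_metrics used for this model.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(y_true, y_pred):
    accuracy = accuracy_score(y_true, y_pred)
    # Weighted: per-class scores averaged by class frequency (support).
    p_w, r_w, f1_w, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    # Macro: per-class scores averaged equally, regardless of support.
    p_m, r_m, f1_m, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy,
        "precision": p_w, "recall": r_w, "f1_weighted": f1_w,
        "precision_macro": p_m, "recall_macro": r_m, "f1_macro": f1_m,
    }
```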


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1