# gena-lm-bert-base-t2t_ft_BioS73_1kbpHG19_DHSs_H3K27AC
This model is a fine-tuned version of [AIRI-Institute/gena-lm-bert-base-t2t](https://huggingface.co/AIRI-Institute/gena-lm-bert-base-t2t) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5681
- F1 Score: 0.8606
- Precision: 0.8346
- Recall: 0.8883
- Accuracy: 0.8464
- AUC: 0.9057
- PRC: 0.8788
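As a quick sanity check, the reported F1 score is consistent with the reported precision and recall via the harmonic-mean identity F1 = 2PR / (P + R):

```python
# Consistency check: F1 should be the harmonic mean of precision and recall.
precision = 0.8346
recall = 0.8883

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # matches the reported F1 Score of 0.8606
```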
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
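The `linear` scheduler above decays the learning rate linearly from `learning_rate` to zero over the course of training (after any warmup, which defaults to none). A minimal pure-Python sketch of that schedule, assuming no warmup; the `total_steps` value is illustrative, not taken from the training config:

```python
def linear_lr(step: int, base_lr: float = 1e-5,
              total_steps: int = 11000, warmup_steps: int = 0) -> float:
    """Linear LR schedule: ramp up over warmup_steps, then decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)

print(linear_lr(0))      # 1e-05 at the start
print(linear_lr(5500))   # 5e-06 halfway through
print(linear_lr(11000))  # 0.0 at the end
```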
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC | PRC |
|---|---|---|---|---|---|---|---|---|---|
| 0.6857 | 0.1864 | 500 | 0.6036 | 0.7971 | 0.7225 | 0.8890 | 0.7585 | 0.8247 | 0.7929 |
| 0.5451 | 0.3727 | 1000 | 0.4777 | 0.8256 | 0.7793 | 0.8778 | 0.8021 | 0.8670 | 0.8418 |
| 0.4643 | 0.5591 | 1500 | 0.4579 | 0.8296 | 0.8228 | 0.8366 | 0.8166 | 0.8780 | 0.8550 |
| 0.4425 | 0.7454 | 2000 | 0.4760 | 0.8315 | 0.8307 | 0.8324 | 0.8200 | 0.8770 | 0.8551 |
| 0.4363 | 0.9318 | 2500 | 0.4510 | 0.8457 | 0.8088 | 0.8862 | 0.8274 | 0.8755 | 0.8489 |
| 0.4201 | 1.1182 | 3000 | 0.4354 | 0.8534 | 0.8062 | 0.9064 | 0.8338 | 0.8876 | 0.8626 |
| 0.4138 | 1.3045 | 3500 | 0.4798 | 0.8504 | 0.8004 | 0.9071 | 0.8297 | 0.8874 | 0.8611 |
| 0.4153 | 1.4909 | 4000 | 0.4520 | 0.8493 | 0.8177 | 0.8834 | 0.8327 | 0.9034 | 0.8996 |
| 0.3919 | 1.6772 | 4500 | 0.4782 | 0.8514 | 0.8192 | 0.8862 | 0.8349 | 0.8972 | 0.8699 |
| 0.3899 | 1.8636 | 5000 | 0.4710 | 0.8547 | 0.8058 | 0.9099 | 0.8349 | 0.8892 | 0.8532 |
| 0.3779 | 2.0499 | 5500 | 0.5085 | 0.8549 | 0.8151 | 0.8987 | 0.8371 | 0.8910 | 0.8629 |
| 0.4259 | 2.2363 | 6000 | 0.4850 | 0.8576 | 0.7992 | 0.9253 | 0.8360 | 0.8904 | 0.8549 |
| 0.3808 | 2.4227 | 6500 | 0.4826 | 0.8530 | 0.7793 | 0.9420 | 0.8267 | 0.8805 | 0.8371 |
| 0.3974 | 2.6090 | 7000 | 0.4625 | 0.8577 | 0.8101 | 0.9113 | 0.8386 | 0.9101 | 0.9002 |
| 0.3948 | 2.7954 | 7500 | 0.5013 | 0.8547 | 0.8243 | 0.8876 | 0.8390 | 0.9022 | 0.8830 |
| 0.3707 | 2.9817 | 8000 | 0.5439 | 0.8597 | 0.7881 | 0.9455 | 0.8353 | 0.8863 | 0.8461 |
| 0.3831 | 3.1681 | 8500 | 0.5381 | 0.8622 | 0.7938 | 0.9434 | 0.8390 | 0.8403 | 0.7716 |
| 0.3949 | 3.3545 | 9000 | 0.5503 | 0.8528 | 0.8219 | 0.8862 | 0.8367 | 0.9103 | 0.8892 |
| 0.405 | 3.5408 | 9500 | 0.5180 | 0.8596 | 0.8090 | 0.9169 | 0.8401 | 0.9103 | 0.8895 |
| 0.3759 | 3.7272 | 10000 | 0.5320 | 0.8581 | 0.8169 | 0.9036 | 0.8405 | 0.8719 | 0.8090 |
| 0.3821 | 3.9135 | 10500 | 0.5786 | 0.8533 | 0.8326 | 0.8750 | 0.8394 | 0.8952 | 0.8614 |
| 0.3608 | 4.0999 | 11000 | 0.5681 | 0.8606 | 0.8346 | 0.8883 | 0.8464 | 0.9057 | 0.8788 |
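Note that logging stops at step 11000 (epoch ≈ 4.1) even though `num_epochs` was set to 20, which suggests training was halted early (e.g. by early stopping). The logged (step, epoch) pairs also let one back out the approximate training-set size; a rough estimate, assuming single-device training with `train_batch_size: 8` and no gradient accumulation:

```python
# Estimate training-set size from the logged (step, epoch) pairs.
# Assumes batch size 8, a single device, and no gradient accumulation.
steps_per_epoch = 500 / 0.1864          # first logged row: step 500 at epoch 0.1864
approx_samples = steps_per_epoch * 8

print(round(steps_per_epoch))  # ~2682 optimizer steps per epoch
print(round(approx_samples))   # ~21459 training examples
```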
## Framework versions
- Transformers 4.42.3
- PyTorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0
Model: tanoManzo/gena-lm-bert-base-t2t_ft_BioS73_1kbpHG19_DHSs_H3K27AC (base model: AIRI-Institute/gena-lm-bert-base-t2t)