metadata
library_name: transformers
tags:
  - generated_from_trainer
datasets:
  - crispr_data
model-index:
  - name: SX_spymac_FOREcasT
    results: []

SX_spymac_FOREcasT

This model is a fine-tuned version (base model not specified in this card) trained on the crispr_data dataset. It achieves the following results on the evaluation set:

  • Loss: 35.6392
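
The snippet below is a hedged loading sketch, not an official usage example from this card: the repository id is inferred from the card title and uploader, and the use of `trust_remote_code` is an assumption (FOREcasT-style models are typically shipped with custom modeling code).

```python
# Hedged sketch: loading the checkpoint from the Hugging Face Hub.
# The repo id is assumed from the card's title and uploader; adjust as needed.
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "ljw20180420/SX_spymac_FOREcasT",  # assumed repo id
    trust_remote_code=True,            # assumed: the architecture ships as custom code
)
```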

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 100
  • eval_batch_size: 100
  • seed: 63036
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 30.0
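
As a rough guide, the hyperparameters above map onto a `transformers.TrainingArguments` configuration along the following lines. This is a sketch, not the authors' training script; the output directory and the evaluation/logging strategies are assumptions.

```python
# Sketch only: maps the reported hyperparameters onto TrainingArguments
# (Transformers 4.44.2). Values not listed on the card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="SX_spymac_FOREcasT",   # hypothetical output directory
    learning_rate=1e-3,
    per_device_train_batch_size=100,   # card reports train_batch_size: 100
    per_device_eval_batch_size=100,
    seed=63036,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=30.0,
    eval_strategy="epoch",             # assumed: the table below logs one eval per epoch
    logging_strategy="epoch",          # assumed
)
# Adam with betas=(0.9, 0.999) and eps=1e-08 matches the Trainer's default
# optimizer settings, so no explicit optimizer arguments are needed here.
```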

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 5311.6051     | 1.0   | 327  | 4340.6602       |
| 2980.4667     | 2.0   | 654  | 1883.8451       |
| 1312.2241     | 3.0   | 981  | 878.0609        |
| 616.8953      | 4.0   | 1308 | 412.7102        |
| 289.3571      | 5.0   | 1635 | 196.8413        |
| 142.727       | 6.0   | 1962 | 103.9544        |
| 80.499        | 7.0   | 2289 | 64.8617         |
| 54.4604       | 8.0   | 2616 | 48.6726         |
| 43.3439       | 9.0   | 2943 | 41.4865         |
| 38.4873       | 10.0  | 3270 | 38.4845         |
| 36.2797       | 11.0  | 3597 | 37.1310         |
| 35.2948       | 12.0  | 3924 | 36.3968         |
| 34.8241       | 13.0  | 4251 | 36.1038         |
| 34.6104       | 14.0  | 4578 | 35.9149         |
| 34.5055       | 15.0  | 4905 | 35.8438         |
| 34.4508       | 16.0  | 5232 | 35.7904         |
| 34.4032       | 17.0  | 5559 | 35.7617         |
| 34.3773       | 18.0  | 5886 | 35.8614         |
| 34.3794       | 19.0  | 6213 | 35.7839         |
| 34.3675       | 20.0  | 6540 | 35.7271         |
| 34.351        | 21.0  | 6867 | 35.7075         |
| 34.3603       | 22.0  | 7194 | 35.6449         |
| 34.3472       | 23.0  | 7521 | 35.6552         |
| 34.3254       | 24.0  | 7848 | 35.6328         |
| 34.3245       | 25.0  | 8175 | 35.7084         |
| 34.3145       | 26.0  | 8502 | 35.6494         |
| 34.3083       | 27.0  | 8829 | 35.6973         |
| 34.297        | 28.0  | 9156 | 35.6497         |
| 34.2859       | 29.0  | 9483 | 35.6311         |
| 34.2783       | 30.0  | 9810 | 35.6392         |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu124
  • Datasets 2.21.0
  • Tokenizers 0.19.1
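
For reproducibility, the sketch below compares a local environment against the versions pinned above; it only prints mismatches.

```python
# Compare installed library versions with those reported on this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu124",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    if installed[name] != want:
        print(f"{name}: installed {installed[name]}, card reports {want}")
```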