---
library_name: transformers
tags:
- generated_from_trainer
datasets:
- crispr_data
model-index:
- name: SX_spcas9_FOREcasT
  results: []
---

# SX_spcas9_FOREcasT

This model is a fine-tuned version of an unspecified base model on the crispr_data dataset. It achieves the following results on the evaluation set:

- Loss: 90.0301

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 100
- eval_batch_size: 100
- seed: 63036
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 30.0
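
Because a warmup *ratio* is given rather than an explicit warmup-step count, the number of warmup steps follows from the total step count. A minimal sketch of that arithmetic, plus a linear-with-warmup schedule (assuming the usual ceiling rounding used by transformers; the 322 optimizer steps per epoch come from the "Step" column of the training results table):

```python
import math

# Hyperparameters from the list above; 322 optimizer steps per epoch
# is read off the training-results table.
learning_rate = 1e-3
warmup_ratio = 0.05
num_epochs = 30
steps_per_epoch = 322

total_steps = steps_per_epoch * num_epochs            # 9660 steps
warmup_steps = math.ceil(total_steps * warmup_ratio)  # 483 steps

def linear_lr(step: int) -> float:
    """LR ramps linearly to its peak during warmup, then decays linearly to 0."""
    if step < warmup_steps:
        return learning_rate * step / warmup_steps
    return learning_rate * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At step 483 the schedule reaches the peak learning rate of 0.001, and it decays to 0 by step 9660, the final optimizer step of epoch 30.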

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 5299.9422     | 1.0   | 322  | 4486.6816       |
| 3409.13       | 2.0   | 644  | 2367.0308       |
| 1740.73       | 3.0   | 966  | 1208.4238       |
| 886.9884      | 4.0   | 1288 | 619.2037        |
| 463.7353      | 5.0   | 1610 | 335.7322        |
| 264.2752      | 6.0   | 1932 | 205.3073        |
| 173.0323      | 7.0   | 2254 | 145.6962        |
| 131.4609      | 8.0   | 2576 | 117.9419        |
| 111.7286      | 9.0   | 2898 | 104.6322        |
| 102.0464      | 10.0  | 3220 | 97.9003         |
| 97.07         | 11.0  | 3542 | 94.3013         |
| 94.5021       | 12.0  | 3864 | 92.5289         |
| 93.0652       | 13.0  | 4186 | 91.5912         |
| 92.3189       | 14.0  | 4508 | 91.0249         |
| 91.8618       | 15.0  | 4830 | 90.6213         |
| 91.6092       | 16.0  | 5152 | 90.4249         |
| 91.4372       | 17.0  | 5474 | 90.2542         |
| 91.3401       | 18.0  | 5796 | 90.2745         |
| 91.2793       | 19.0  | 6118 | 90.1836         |
| 91.2196       | 20.0  | 6440 | 90.1465         |
| 91.1831       | 21.0  | 6762 | 90.0652         |
| 91.1484       | 22.0  | 7084 | 90.1792         |
| 91.1333       | 23.0  | 7406 | 90.0813         |
| 91.1064       | 24.0  | 7728 | 90.1987         |
| 91.09         | 25.0  | 8050 | 90.0518         |
| 91.0658       | 26.0  | 8372 | 90.0653         |
| 91.0503       | 27.0  | 8694 | 90.0538         |
| 91.0277       | 28.0  | 9016 | 90.0120         |
| 91.013        | 29.0  | 9338 | 90.0356         |
| 90.9967       | 30.0  | 9660 | 90.0301         |
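
Note that the validation loss plateaus around 90 over the final epochs and fluctuates slightly; the lowest value (90.0120) occurs at epoch 28, not the final epoch. A small sketch that picks the best epoch from the tail of the table:

```python
# Validation losses for the final epochs, copied from the table above.
val_loss = {
    24: 90.1987, 25: 90.0518, 26: 90.0653, 27: 90.0538,
    28: 90.0120, 29: 90.0356, 30: 90.0301,
}

# Select the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # epoch 28 has the lowest validation loss
```

If checkpoints were saved per epoch, the epoch-28 checkpoint would be the natural choice under a best-validation-loss criterion.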

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu124
- Datasets 2.21.0
- Tokenizers 0.19.1