---
datasets:
  - alpindale/light-novels
---

I am training on this dataset as a test. Well, a six-day test; RIP my A6000. The training settings I used are below.

{"lora_name": "LightNovels", 
"always_override": false, 
"save_steps": 400.0, 
"micro_batch_size": 4, 
"batch_size": 128, 
"epochs": 1.0, 
"learning_rate": "3e-4", 
"lr_scheduler_type": "constant", 
"lora_rank": 256, 
"lora_alpha": 512, 
"lora_dropout": 0.05, 
"cutoff_len": 2048, 
"dataset": "None", 
"eval_dataset": "None", 
"format": "None", 
"eval_steps": 100.0, 
"raw_text_file": "LightNovels_clean", 
"overlap_len": 512, 
"newline_favor_len": 512, 
"do_shuffle": true, 
"higher_rank_limit": false, 
"warmup_steps": 100.0, 
"optimizer": "adamw_torch"}