Could you share the training config?

#4
by Lohse - opened

I'm fine-tuning on alpaca-gpt4. Could you share the training config (training script), e.g.:
batch size
lr
lr_scheduler_type
seq_length
lora_rank
and how you preprocess the input?
:)

Hi @Lohse ,

I lost the training script, but here are the parameters:

lr: 0.0004
lr_scheduler_type: cosine
seq_length: 8192
lora_rank: 8
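Since the script is lost, here is a minimal sketch of how the parameters above, plus the usual alpaca-gpt4 preprocessing, could be wired up. The prompt template is the standard Alpaca one, and the `format_example` helper and the `config` dict are my own names; whether this exact template (or any particular batch size) was used for this run is an assumption, not a confirmed detail.

```python
# Hedged sketch: stated hyperparameters plus typical Alpaca-style preprocessing.
# Batch size was not stated above, so it is deliberately omitted.

config = {
    "lr": 4e-4,                     # stated above (0.0004)
    "lr_scheduler_type": "cosine",  # stated above
    "seq_length": 8192,             # stated above
    "lora_rank": 8,                 # stated above
}

# Standard Alpaca prompt templates (assumption: this run used the same ones).
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:\n"
)
PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def format_example(example: dict) -> str:
    """Build the full training text (prompt + target output) for one record."""
    if example.get("input"):
        prompt = PROMPT_WITH_INPUT.format(**example)
    else:
        prompt = PROMPT_NO_INPUT.format(instruction=example["instruction"])
    return prompt + example["output"]
```

The formatted string would then be tokenized and truncated/packed to `seq_length` before training.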
