gpt4-alpaca-lora_mlp-65b / adapter_config.json

Commit History

10 epochs and rank set to 16
3d067d8

chtan committed on

Upload 3 files
a77eafc

chtan committed on
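
The first commit above notes the LoRA rank being set to 16. As a rough illustration only, the following Python sketch shows how an adapter_config.json with r=16 could be produced with the Hugging Face PEFT library; every value other than r=16 (the alpha, dropout, target modules, and output path) is an assumption for illustration and is not taken from this repository.

from peft import LoraConfig

config = LoraConfig(
    r=16,                     # rank from the "rank set to 16" commit message
    lora_alpha=16,            # assumed scaling factor, not from this repo
    lora_dropout=0.05,        # assumed dropout, not from this repo
    target_modules=["gate_proj", "down_proj", "up_proj"],  # assumed MLP projections, hinted at by the "_mlp" repo name
    bias="none",
    task_type="CAUSAL_LM",
)
config.save_pretrained("gpt4-alpaca-lora_mlp-65b")  # writes adapter_config.json to this directory

Note that the "10 epochs" part of the commit message is a training hyperparameter; it would not normally be stored in adapter_config.json itself.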