Tucano-1b1-Instruct / evals-dpo.yaml

Commit History

Upload evals-dpo.yaml with huggingface_hub
Commit: d7744f9 (verified)
Committed by nicholasKluge
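
The commit message indicates the file was pushed with the huggingface_hub library. Below is a minimal sketch of such an upload using HfApi.upload_file; the repo_id ("TucanoBR/Tucano-1b1-Instruct") is an assumption inferred from the page title, not confirmed by this page, and authentication is assumed to be configured already.

    from huggingface_hub import HfApi

    # Sketch of the upload implied by the commit message.
    # Assumes prior authentication (e.g. via `huggingface-cli login`).
    api = HfApi()
    api.upload_file(
        path_or_fileobj="evals-dpo.yaml",        # local file to push
        path_in_repo="evals-dpo.yaml",           # destination path in the repo
        repo_id="TucanoBR/Tucano-1b1-Instruct",  # assumed repository id
        repo_type="model",
        commit_message="Upload evals-dpo.yaml with huggingface_hub",
    )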