Tucano-2b4-Instruct / evals-dpo.yaml

Commit History

Upload evals-dpo.yaml with huggingface_hub
ee42b7e (verified)

nicholasKluge committed on
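
The commit message indicates the file was pushed with the huggingface_hub client. Below is a minimal sketch of what such an upload looks like using HfApi.upload_file; the repo namespace ("TucanoBR") and the local file path are assumptions, not taken from this page.

# Sketch of the upload described by the commit message, using
# huggingface_hub's HfApi.upload_file. Namespace and local path
# are assumed, not confirmed by this page.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="evals-dpo.yaml",        # local YAML file (assumed path)
    path_in_repo="evals-dpo.yaml",           # destination path in the repo
    repo_id="TucanoBR/Tucano-2b4-Instruct",  # assumed namespace
    repo_type="model",
    commit_message="Upload evals-dpo.yaml with huggingface_hub",
)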