llm_deploy_small / generation_config.json
{
"_from_model_config": true,
"transformers_version": "4.30.2",
"use_cache": false
}
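
A minimal sketch of how this file is typically consumed: Hugging Face transformers reads generation_config.json when building a GenerationConfig for the model. The repo id "Ozgur98/llm_deploy_small" below is assumed from the page context and may differ; the field names match the JSON above.

# Assumed repo id, inferred from the page context.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("Ozgur98/llm_deploy_small")

# use_cache=False disables the key/value cache during generation,
# trading generation speed for lower memory use.
print(gen_config.use_cache)             # False
print(gen_config.transformers_version)  # "4.30.2"

The "_from_model_config" flag indicates this generation config was derived from the model's own config.json rather than written by hand, so transformers may overwrite it when the model config changes.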