Tags: Transformers, ctranslate2, int8, float16, Composer, MosaicML, llm-foundry
ct2fast-mpt-7b-chat / generation_config.json
Commit cc8bde5: Upload mosaicml/mpt-7b-chat ctranslate fp16 weights
{
  "_from_model_config": true,
  "transformers_version": "4.28.1",
  "use_cache": false
}
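
A minimal usage sketch, not taken from this repository: it assumes the converted weights are published under the repo id michaelfeil/ct2fast-mpt-7b-chat and that the tokenizer of the original mosaicml/mpt-7b-chat model is reused. Since this generation_config.json carries no sampling defaults, the decoding parameters below are purely illustrative.

# Sketch: load the CTranslate2-converted MPT-7B-chat weights and generate text.
# Repo id and decoding parameters are assumptions, not taken from this repository.
import ctranslate2
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer

# Download the converted weights (assumed repo id).
model_dir = snapshot_download("michaelfeil/ct2fast-mpt-7b-chat")

# float16 matches the uploaded weights; int8 is also listed among the tags.
generator = ctranslate2.Generator(model_dir, device="cuda", compute_type="float16")
tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-7b-chat")

prompt = "What is CTranslate2?"
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

results = generator.generate_batch(
    [tokens],
    max_length=128,               # illustrative values
    sampling_temperature=0.7,
    sampling_topk=40,
    include_prompt_in_result=False,
)
print(tokenizer.decode(results[0].sequences_ids[0]))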