Tags: Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · custom_code · text-generation-inference
Commit 6ca4d2a (1 parent: 4e61cee), committed by abhi-mosaic

Update README.md

Files changed (1):
  README.md (+1, -0)
README.md CHANGED
@@ -70,6 +70,7 @@ Although the model was trained with a sequence length of 2048, ALiBi enables use
 config = transformers.AutoConfig.from_pretrained('mosaicml/mpt-7b', trust_remote_code=True)
 config.update({"max_seq_len": 4096})
 model = transformers.AutoModelForCausalLM.from_pretrained('mosaicml/mpt-7b', config=config, trust_remote_code=True)
+```
 
 This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
 
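For context, the snippet in this hunk (which the commit fixes by adding the missing closing code fence) shows how ALiBi lets MPT-7B run at sequence lengths beyond the 2048 tokens it was trained on, by overriding `max_seq_len` in the config before loading. Below is a minimal, self-contained sketch of that usage; the `import transformers` line and the tokenizer load are assumptions filled in from the surrounding README text, not part of the diff itself.

```python
import transformers

# From the README hunk above: override max_seq_len before loading so ALiBi
# can extrapolate past the 2048-token training context.
config = transformers.AutoConfig.from_pretrained('mosaicml/mpt-7b', trust_remote_code=True)
config.update({"max_seq_len": 4096})
model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b',
    config=config,
    trust_remote_code=True,
)

# The README states the model was trained with the EleutherAI/gpt-neox-20b
# tokenizer, so load that tokenizer for inference (assumed usage, not shown
# in the diff).
tokenizer = transformers.AutoTokenizer.from_pretrained('EleutherAI/gpt-neox-20b')
```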