Tags: Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · custom_code · text-generation-inference
jacobfulano committed
Commit 487de08
1 Parent(s): 3d5c293

Update README.md

Files changed (1)
  1. README.md +6 -1
README.md CHANGED
@@ -30,7 +30,7 @@ Apache-2.0 (commercial use permitted)
 
 * [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
 * [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
-* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-w0tiddn9-WGTlRpfjcO9J5jyrMub1dg)!
+* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!
 
 ### Example Question/Instruction
 
@@ -112,6 +112,11 @@ While great efforts have been taken to clean the pretraining data, it is possibl
 
 This model was finetuned by Sam Havens and the MosaicML NLP team
 
+## MosaicML Platform
+
+If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo).
+
+
 ## Citation
 
 Please cite this model using the following format: