Update README.md
README.md
CHANGED
@@ -13,7 +13,7 @@ inference: false
 
 MPT-7B-Instruct is a model for short-form instruction following.
 It is built by finetuning [MPT-7B](https://huggingface.co/spaces/mosaicml/mpt-7b) on a [dataset](https://huggingface.co/datasets/sam-mosaic/dolly_hhrlhf) derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets.
 
-* License: _CC-By-SA-3.0_
+* License: _CC-By-SA-3.0_
 * [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-instruct)
 
@@ -25,7 +25,7 @@ May 5, 2023
 
 ## Model License
 
-CC-By-SA-3.0
+CC-By-SA-3.0
 
 ## Documentation
 
@@ -140,6 +140,9 @@ This model was finetuned by Sam Havens and the MosaicML NLP team
 
 If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
 
+## Disclaimer
+
+The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
 
 ## Citation
 
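For context on the "short-form instruction following" described above: models finetuned on dolly_hhrlhf-style data are usually prompted with an Alpaca-style instruction template. A minimal sketch of that wrapping, assuming the standard template string (the exact template is an assumption here, not part of this diff):

```python
# Hypothetical sketch of an Alpaca-style instruction template. The exact
# string used during finetuning is an assumption, not taken from this diff.
INSTRUCTION_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def format_instruction(instruction: str) -> str:
    """Wrap a raw instruction in the template before sending it to the model."""
    return INSTRUCTION_TEMPLATE.format(instruction=instruction)

prompt = format_instruction("Explain the difference between a list and a tuple.")
```

The model's completion would then be generated as a continuation of the text after `### Response:`.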