Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · custom_code · text-generation-inference
sam-mosaic committed in commit 56bcbea (parent: 2e70654)

Update README.md

Files changed (1): README.md (+5, −5)
README.md CHANGED
@@ -1,15 +1,15 @@
 ---
-license: cc-by-sa-3.0
+license: apache-2.0
 datasets:
 - competition_math
-- conceptofmind/cot_submix_original/cot_gsm8k
 - knkarthick/dialogsum
 - mosaicml/dolly_hhrlhf
 - duorc
-- tau/scrolls/qasper
 - emozilla/quality
 - scrolls/summ_screen_fd
 - spider
+- gsm8k
+- allenai/qasper
 tags:
 - Composer
 - MosaicML
@@ -21,7 +21,7 @@ inference: false
 
 MPT-30B-Instruct is a model for short-form instruction following.
 It is built by finetuning [MPT-30B](https://huggingface.co/mosaicml/mpt-30b) on [Dolly HHRLHF](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets. It is also trained on [Competition Math](https://huggingface.co/datasets/competition_math), [Duorc](https://huggingface.co/datasets/duorc), [CoT GSM8k](https://huggingface.co/datasets/conceptofmind/cot_submix_original), [Qasper](https://huggingface.co/datasets/allenai/qasper), [Quality](https://huggingface.co/datasets/emozilla/quality), [Summ Screen FD](https://huggingface.co/datasets/tau/scrolls) and [Spider](https://huggingface.co/datasets/spider).
-  * License: _CC-By-SA-3.0_
+  * License: Apache 2.0
 
 
 This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
@@ -32,7 +32,7 @@ June 22, 2023
 
 ## Model License
 
-CC-By-SA-3.0
+Apache 2.0
 
 ## Documentation
 
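
For reference, a minimal sketch of loading the checkpoint this card describes with Hugging Face transformers. The repo id `mosaicml/mpt-30b-instruct` is an assumption inferred from the card; MPT ships custom modeling code (the custom_code tag above), so `trust_remote_code=True` is required. The prompt template is a dolly_hhrlhf-style assumption, not text taken from this diff.

```python
# Minimal sketch: load MPT-30B-Instruct with Hugging Face transformers.
# Assumptions: repo id "mosaicml/mpt-30b-instruct", a bf16-capable GPU,
# and `accelerate` installed so device_map="auto" can place the weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mosaicml/mpt-30b-instruct"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,  # assumption: half precision to fit memory
    trust_remote_code=True,      # MPT uses custom modeling code
    device_map="auto",           # assumption: requires `accelerate`
)

# Dolly-style instruction template; the exact wording is an assumption
# based on the dolly_hhrlhf dataset the card lists, not quoted from the diff.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\nWhat does the Apache 2.0 license permit?\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```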