kartikmosaicml committed
Commit c36667f
Parent: c41b073

Update README.md

Files changed (1)
  1. README.md +14 -13
README.md CHANGED
@@ -52,28 +52,29 @@ _CC-By-NC-SA-4.0_ (non-commercial use only)
  ### Example Dialogue
 
  **ksreenivasan**:
- I want you to come up with a tweet based on this summary of the article:
- >We are excited to extend the MosaicML Foundation Series with MPT-30B, a new commercially licensed open-source model that is significantly more powerful, and outperforms the
- >original GPT-3. In addition, we are releasing two fine-tuned models, MPT-30B-Instruct and MPT-30B-Chat, that are built on top of MPT-30B and excel at short-form instruction
- >following and multi-turn conversations, respectively.
- >All MPT-30B models come with special features that differentiate them from other LLMs, including an 8k token context window (which can be further extended via finetuning;
- >see MPT-7B-StoryWriter), support for context-length extrapolation via ALiBi, and efficient inference + training performance via FlashAttention. It also has strong coding
- >abilities thanks to its pretraining data mixture.
+ > I want you to come up with a tweet based on this summary of the article:
+ >
+ > We are excited to extend the MosaicML Foundation Series with MPT-30B, a new commercially licensed open-source model that is significantly more powerful, and outperforms the
+ > original GPT-3. In addition, we are releasing two fine-tuned models, MPT-30B-Instruct and MPT-30B-Chat, that are built on top of MPT-30B and excel at short-form instruction
+ > following and multi-turn conversations, respectively.
+ > All MPT-30B models come with special features that differentiate them from other LLMs, including an 8k token context window (which can be further extended via finetuning;
+ > see MPT-7B-StoryWriter), support for context-length extrapolation via ALiBi, and efficient inference + training performance via FlashAttention. It also has strong coding
+ > abilities thanks to its pretraining data mixture.
 
 
 
  **MPT-30B-Chat**:
- >"Revolutionize your AI capabilities with MosaicML's latest addition, MPT-30B! Outperforming GPT-3, this open-source model is significantly more powerful and comes with
- >finely-tuned models for exceptional instruction following and multi-turn conversations. Unleash the potential of an 8k token context window, ALiBi support, and FlashAttention.
- >Experience the future of AI with MosaicML! #AI #ML #innovation"
+ > "Revolutionize your AI capabilities with MosaicML's latest addition, MPT-30B! Outperforming GPT-3, this open-source model is significantly more powerful and comes with
+ > finely-tuned models for exceptional instruction following and multi-turn conversations. Unleash the potential of an 8k token context window, ALiBi support, and FlashAttention.
+ > Experience the future of AI with MosaicML! #AI #ML #innovation"
 
  **ksreenivasan**:
- >Is there anything you want to say to MosaicML about this exciting news?
+ > Is there anything you want to say to MosaicML about this exciting news?
 
  **MPT-30B-Chat**:
  > As your AI marketing assistant, I would say that this is truly exciting news for the AI community! MosaicML's latest addition, MPT-30B, is poised to revolutionize the industry with its powerful open-source capabilities and finely-tuned models for instruction following and multi-turn conversations.
- >The special features, including the 8k token context window, ALiBi support, and FlashAttention, are sure to set MosaicML's MPT-30B apart from other LLMs.
- >Congratulations to the MosaicML team on this incredible achievement!
+ > The special features, including the 8k token context window, ALiBi support, and FlashAttention, are sure to set MosaicML's MPT-30B apart from other LLMs.
+ > Congratulations to the MosaicML team on this incredible achievement!
 
  ## How to Use
 
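The hunk ends at the README's `## How to Use` heading, whose body falls outside this diff. For orientation, here is a minimal sketch of the usual Hugging Face loading pattern for MPT checkpoints, assuming the `mosaicml/mpt-30b-chat` repo id (the repo this card belongs to); the README's actual snippet may differ. MPT checkpoints ship custom modeling code, so `trust_remote_code=True` is required:

```python
# Minimal sketch (not the README's verbatim snippet): load MPT-30B-Chat.
import transformers

# MPT checkpoints include custom modeling code, hence trust_remote_code=True.
model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-30b-chat',
    trust_remote_code=True,
)

# MPT models were trained with the EleutherAI/gpt-neox-20b tokenizer.
tokenizer = transformers.AutoTokenizer.from_pretrained('EleutherAI/gpt-neox-20b')
```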
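The dialogue also mentions extending the 8k token context window via ALiBi's context-length extrapolation. A hedged sketch of how the related MPT model cards describe doing this, assuming the MPT config exposes a `max_seq_len` field as the MPT-7B cards do; the target length below is illustrative:

```python
# Sketch, assuming MPT's config exposes max_seq_len as in the MPT-7B cards.
# ALiBi lets the model attend beyond its 8k training window at inference.
import transformers

config = transformers.AutoConfig.from_pretrained(
    'mosaicml/mpt-30b-chat', trust_remote_code=True
)
config.max_seq_len = 16384  # illustrative target; raises the 8k default

model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-30b-chat',
    config=config,
    trust_remote_code=True,
)
```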