pabloce committed on
Commit
1ec5222
1 Parent(s): 3e1f88c

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -23,6 +23,7 @@ This is our most spectacular outcome ever. FFT, all parameters, 16bit. 77.4 MML
 
 Although the max positional embeddings is 4k, we used rope theta of 1000000.0 and we trained with sequence length 8k. We plan to train on the upcoming 32k version as well.
 
+[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/cognitivecomputations)
 Discord: https://discord.gg/cognitivecomputations
 
 <img src="https://cdn-uploads.huggingface.co/production/uploads/63111b2d88942700629f5771/ldkN1J0WIDQwU4vutGYiD.png" width="600" />
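The "rope theta of 1000000.0" mentioned in the README text is the standard RoPE base-frequency trick for stretching usable context beyond the 4k max positional embeddings: raising the base slows the lowest-frequency rotations so positional signals stay distinguishable at 8k tokens. A minimal sketch of the per-dimension-pair wavelengths (the head dimension of 128 is an assumption for illustration, not stated in this commit):

```python
import math

def rope_wavelengths(theta: float, head_dim: int = 128) -> list[float]:
    """Rotation wavelengths (in token positions) for each RoPE dimension pair.

    Pair i rotates at frequency theta ** (-2*i / head_dim), so its
    wavelength is 2*pi * theta ** (2*i / head_dim). A larger base theta
    stretches the longest wavelengths, which is the usual way to extend
    the context window past the original positional-embedding limit.
    """
    return [2 * math.pi * theta ** (2 * i / head_dim) for i in range(head_dim // 2)]

# Default Llama-style base vs. the base this training run reportedly used.
default_w = rope_wavelengths(10_000.0)
scaled_w = rope_wavelengths(1_000_000.0)

# The slowest-rotating pair covers far more positions with the larger base,
# so an 8k sequence still falls well inside one period.
print(f"longest wavelength, theta=1e4:  {default_w[-1]:.0f} tokens")
print(f"longest wavelength, theta=1e6:  {scaled_w[-1]:.0f} tokens")
```

This is only a sketch of the mechanism; the actual training configuration lives in the repository's axolotl config, not in this commit.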