add paper link
README.md
CHANGED
@@ -25,7 +25,7 @@ BigCode is an open scientific collaboration working on responsible training of l
 StarCoder is a 15.5B-parameter language model for code, trained on 1T tokens from 80+ programming languages. It uses MQA for efficient generation, has an 8,192-token context window, and can do fill-in-the-middle.

 ### Models

-- [Paper](): A technical report about StarCoder.
+- [Paper](https://drive.google.com/file/d/1cN-b9GnWtHzQRoE7M7gAEyivY0kl4BYs/view): A technical report about StarCoder.
 - [GitHub](https://github.com/bigcode-project/starcoder/tree/main): All you need to know about using or fine-tuning StarCoder.
 - [StarCoder](https://huggingface.co/bigcode/starcoder): StarCoderBase further trained on Python.
 - [StarCoderBase](https://huggingface.co/bigcode/starcoderbase): Trained on 80+ languages from The Stack.
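
The README text above mentions that StarCoder supports fill-in-the-middle (FIM) generation. As a minimal sketch, FIM prompts for this family of models are typically assembled from sentinel tokens that mark the code before and after the gap; the exact token names below are an assumption based on the StarCoder model card, not verified against the tokenizer here.

```python
# Hedged sketch: assembling a fill-in-the-middle prompt for StarCoder.
# Sentinel token names are assumptions from the model card.
FIM_PREFIX = "<fim_prefix>"
FIM_SUFFIX = "<fim_suffix>"
FIM_MIDDLE = "<fim_middle>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Place the code before and after the gap so the model
    generates the missing middle after the final sentinel."""
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"

prompt = build_fim_prompt(
    prefix="def add(a, b):\n    return ",
    suffix="\n\nprint(add(1, 2))\n",
)
```

The resulting string would then be passed to the model as a plain text-generation prompt; the completion is the code that fills the gap.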