Update README.md
#32 · opened by Ziyang
README.md
CHANGED
@@ -33,7 +33,7 @@ This is the Full-Weight of WizardCoder.
 
 
 <p align="center">
-🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a>
+🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> •🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br>
 </p>
 <p align="center">
 👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a>

@@ -48,6 +48,9 @@ This is the Full-Weight of WizardCoder.
 | ----- |------| ---- |------|-------| ----- | ----- |
 | WizardCoder-Python-34B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 73.2 | 61.2 | [Demo](http://47.103.63.15:50085/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
 | WizardCoder-15B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 59.8 | 50.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
+| WizardCoder-Python-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 64.0 | 55.6 | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
+| WizardCoder-3B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-3B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 34.8 | 37.4 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
+| WizardCoder-1B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-1B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 23.8 | 28.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
 
 - 🔥 [08/11/2023] We release **WizardMath** Models.
 - 🔥 Our **WizardMath-70B-V1.0** model slightly outperforms some closed-source LLMs on the GSM8K, including **ChatGPT 3.5**, **Claude Instant 1** and **PaLM 2 540B**.

@@ -70,7 +73,6 @@ This is the Full-Weight of WizardCoder.
 | <sup>WizardLM-30B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-30B-V1.0" target="_blank">HF Link</a></sup> | | <sup>7.01</sup> | | <sup>97.8% </sup> | <sup>37.8 pass@1</sup>| <sup>Non-commercial</sup> |
 | <sup>WizardLM-13B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.0" target="_blank">HF Link</a> </sup> | | <sup>6.35</sup> | <sup>75.31%</sup> | <sup>89.1% </sup> |<sup> 24.0 pass@1 </sup> | <sup>Non-commercial</sup>|
 | <sup>WizardLM-7B-V1.0 </sup>| <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-7B-V1.0" target="_blank">HF Link</a> </sup> |<sup> 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> </sup>| | | <sup>78.0% </sup> |<sup>19.1 pass@1 </sup>|<sup> Non-commercial</sup>|
-| <sup>WizardCoder-15B-V1.0</sup> | <sup> 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a></sup> | <sup>📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a></sup> | || |<sup> 57.3 pass@1 </sup> | <sup> <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a></sup> |
 </font>
 
 
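For context on the new table entries, here is a minimal sketch of loading one of the added checkpoints with the Hugging Face `transformers` library. It is not part of this PR or of the README being changed: the model ID is taken from the table above, while the prompt and generation settings are illustrative assumptions.

```python
# Minimal sketch (not from this PR): load one of the checkpoints listed in the
# updated model table. The prompt and generation settings are assumptions for
# illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WizardLM/WizardCoder-Python-13B-V1.0"  # HF Link from the table above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding keeps the example deterministic; adjust max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```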