Update README.md
README.md
CHANGED
@@ -22,7 +22,7 @@ tags:
 [![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI)
 [![GitHub](https://img.shields.io/github/followers/PrunaAI?label=Follow%20%40PrunaAI&style=social)](https://github.com/PrunaAI)
 [![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
-[![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.gg/
+[![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.gg/rskEr4BZJx)

 ## This repo contains GGUF versions of the winglian/Llama-3-8b-64k-PoSE model.

@@ -32,7 +32,7 @@ tags:
 - Contact us and tell us which model to compress next [here](https://www.pruna.ai/contact).
 - Request access to easily compress your *own* AI models [here](https://z0halsaff74.typeform.com/pruna-access?typeform-source=www.pruna.ai).
 - Read the documentations to know more [here](https://pruna-ai-pruna.readthedocs-hosted.com/en/latest/)
-- Join Pruna AI community on Discord [here](https://discord.gg/
+- Join Pruna AI community on Discord [here](https://discord.gg/rskEr4BZJx) to share feedback/suggestions or get help.

 **Frequently Asked Questions**
 - ***How does the compression work?*** The model is compressed with GGUF.
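The README text above states that this repo ships GGUF versions of winglian/Llama-3-8b-64k-PoSE. As a rough illustration only, here is a minimal sketch of loading such a GGUF file with llama-cpp-python; the file name and quantization level are assumptions, not taken from this commit.

# Minimal sketch: running a GGUF model with llama-cpp-python.
# The file name is hypothetical; substitute whichever quantization you downloaded from the repo.
from llama_cpp import Llama

llm = Llama(
    model_path="./Llama-3-8b-64k-PoSE.Q4_K_M.gguf",  # assumed file name
    n_ctx=8192,  # context window; raise it if your hardware allows
)

output = llm(
    "Q: Name three advantages of quantized language models. A:",
    max_tokens=64,
    stop=["Q:"],
)
print(output["choices"][0]["text"])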