Update README.md
README.md CHANGED
@@ -17,9 +17,9 @@ Welcome to the GPU-Poor LLM Gladiator Arena, where frugal meets fabulous in the
 
 ## 🤔 Starting from "Why?"
 
-In the recent months, We've seen a lot of these "Tiny" models released, and some
+In the recent months, we've seen a lot of these "Tiny" models released, and some of them are really impressive.
 
-- **Gradio Exploration**: This project serves me as a playground for experimenting with Gradio app development
+- **Gradio Exploration**: This project serves me as a playground for experimenting with Gradio app development; I am learning how to create interactive AI interfaces with it.
 
 - **Tiny Model Evaluation**: I wanted to develop a personal (and now public) stats system for evaluating tiny language models. It's not too serious, but it provides valuable insights into the capabilities of these compact powerhouses.
 
@@ -49,8 +49,8 @@ In the recent months, We've seen a lot of these "Tiny" models released, and some
 
 1. Clone the repository:
 ```
-git clone https://
-cd gpu-poor-llm-
+git clone https://huggingface.co/spaces/k-mktr/gpu-poor-llm-arena.git
+cd gpu-poor-llm-arena
 ```
 
 2. Install the required packages:
@@ -102,7 +102,7 @@ The arena currently supports various compact models, including:
 
 ## 🤝 Contributing
 
-Contributions are welcome!
+Contributions are welcome! Please feel free to suggest a model that Ollama supports. Some results are already quite surprising.
 
 ## 📜 License
 
@@ -111,6 +111,6 @@ This project is open-source and available under the MIT License
 ## 🙏 Acknowledgements
 
 - Thanks to the Ollama team for providing that amazing tool.
-- Shoutout to all the AI researchers and compact language models teams
+- Shoutout to all the AI researchers and compact language models teams for making this frugal AI arena possible!
 
 Enjoy the battles in the GPU-Poor LLM Gladiator Arena! May the best compact model win! 🏆
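The "Gradio Exploration" bullet and the Ollama acknowledgement above describe the general shape of the app: a Gradio front end talking to small models served locally by Ollama. Here is a minimal, hypothetical sketch of that idea (not the arena's actual code); it assumes `gradio` and the official `ollama` Python client are installed, a local Ollama server is running, and the example tag `llama3.2:1b` has already been pulled.

```python
# Illustrative sketch only (not the arena's actual implementation).
# Assumes: `pip install gradio ollama`, a running local Ollama server,
# and that the example model tag "llama3.2:1b" has been pulled.
import gradio as gr
import ollama

MODEL = "llama3.2:1b"  # assumed example tag; any small Ollama model works

def ask(prompt: str) -> str:
    """Send one prompt to the local Ollama model and return its reply."""
    response = ollama.chat(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

demo = gr.Interface(
    fn=ask,
    inputs=gr.Textbox(label="Prompt"),
    outputs=gr.Textbox(label=f"{MODEL} says"),
    title="Tiny-model playground (illustrative)",
)

if __name__ == "__main__":
    demo.launch()
```

The arena itself pits compact models against each other in battles; this sketch only wires a single prompt box to one model to show the basic interaction pattern.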
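For the Contributing note ("suggest a model that Ollama supports"), an equally hypothetical sketch of how a candidate tag could be smoke-tested locally before proposing it; the tag name is only an example and the calls assume the official `ollama` Python client.

```python
# Illustrative sketch: smoke-test a candidate model locally before
# suggesting it. Assumes `pip install ollama` and a running Ollama server.
import ollama

candidate = "qwen2.5:0.5b"  # hypothetical example of a "GPU-poor" sized tag

ollama.pull(candidate)  # downloads the model if it is not already present

reply = ollama.generate(model=candidate, prompt="Say hello in five words.")
print(reply["response"])
```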