---
library_name: transformers
license: other
license_name: qwen
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
base_model: Sao10K/72B-Qwen2.5-Kunou-v1
tags:
- generated_from_trainer
- exl2
model-index:
- name: 72B-Qwen2.5-Kunou-v1
  results: []
---
![Kunou](https://huggingface.co/Sao10K/72B-Qwen2.5-Kunou-v1/resolve/main/knn.png)

# 72B-Qwen2.5-Kunou-v1 - EXL2 7.0bpw
This is a 7.0bpw EXL2 quant of [Sao10K/72B-Qwen2.5-Kunou-v1](https://huggingface.co/Sao10K/72B-Qwen2.5-Kunou-v1). Details about the model can be found on the original model page.
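For local use outside of a UI, the quant can be loaded with the exllamav2 Python API. The snippet below is a minimal sketch, assuming the repository has been downloaded to a local directory; the path, prompt, and generation settings are placeholders, not part of this release.

```python
# Minimal sketch: load an EXL2 quant with the exllamav2 Python API (>= 0.2.4).
# The model path and prompt are placeholders -- adjust for your setup.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

config = ExLlamaV2Config("/path/to/72B-Qwen2.5-Kunou-v1-exl2-7.0bpw")
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate the cache lazily so autosplit can size it
model.load_autosplit(cache)                # split the 72B weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Write a short greeting.", max_new_tokens=64))
```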
## EXL2 Version
These quants were made with exllamav2 version 0.2.4. Quants made with this version of exllamav2 may not load on older versions of the library.
If you have problems loading these models, please update Text Generation WebUI to the latest version.
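If you are unsure which exllamav2 build your environment has, a quick check like the sketch below will tell you whether it is older than 0.2.4 (this assumes the package exposes `__version__`, and uses `packaging` for the comparison):

```python
# Sketch: confirm the installed exllamav2 version is new enough for these quants.
from packaging.version import Version
import exllamav2

print(exllamav2.__version__)
if Version(exllamav2.__version__) < Version("0.2.4"):
    print("exllamav2 is older than 0.2.4 -- these quants may fail to load")
```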
## Perplexity Scoring
Below are the perplexity scores for the EXL2 quants of this model. A lower score is better.
| Quant Level (bpw) | Perplexity Score |
|-------------------|------------------|
| 4.5               | 5.0995           |
| 4.0               | 5.1480           |
| 3.5               | 5.3055           |
| 3.0               | 5.6398           |
| 2.75              | 6.0205           |
| 2.5               | 6.5372           |
| 2.25              | 7.2350           |
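For reference, perplexity is the exponential of the average negative log-likelihood over a held-out text sample, so small differences near the top of the table matter more than they look. The sketch below shows the general shape of that calculation; the tensors are stand-ins for one forward pass over an evaluation text, and it is not the exact evaluation used to produce the table above.

```python
# Sketch: how a perplexity score is derived from per-token log-probabilities.
import torch
import torch.nn.functional as F

def perplexity(logits: torch.Tensor, target_ids: torch.Tensor) -> float:
    # logits: [seq_len, vocab_size] predictions for each position
    # target_ids: [seq_len] the tokens that actually followed
    nll = F.cross_entropy(logits, target_ids, reduction="mean")  # mean negative log-likelihood
    return torch.exp(nll).item()                                 # perplexity = exp(mean NLL)

# Toy usage with random values, just to show the call shape (Qwen2.5 vocab is ~152k):
fake_logits = torch.randn(128, 152064)
fake_targets = torch.randint(0, 152064, (128,))
print(perplexity(fake_logits, fake_targets))
```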