---
license: other
datasets:
- mlabonne/orpo-dpo-mix-40k
tags:
- dpo
---
**Exllamav2** quant (**exl2** / **6.5 bpw**) made with ExLlamaV2 v0.1.1

Other EXL2 quants:
| **Quant (bpw)** | **Model Size** | **lm_head (bits)** |
| ----- | ---------- | ------- |
|<center>**[2.2](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-2_2bpw_exl2)**</center> | <center>3250 MB</center> | <center>6</center> |
|<center>**[2.5](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-2_5bpw_exl2)**</center> | <center>3479 MB</center> | <center>6</center> |
|<center>**[3.0](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-3_0bpw_exl2)**</center> | <center>3895 MB</center> | <center>6</center> |
|<center>**[3.5](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-3_5bpw_exl2)**</center> | <center>4310 MB</center> | <center>6</center> |
|<center>**[3.75](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-3_75bpw_exl2)**</center> | <center>4519 MB</center> | <center>6</center> |
|<center>**[4.0](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-4_0bpw_exl2)**</center> | <center>4727 MB</center> | <center>6</center> |
|<center>**[4.25](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-4_25bpw_exl2)**</center> | <center>4931 MB</center> | <center>6</center> |
|<center>**[5.0](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-5_0bpw_exl2)**</center> | <center>5559 MB</center> | <center>6</center> |
|<center>**[6.0](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-6_0bpw_exl2)**</center> | <center>6495 MB</center> | <center>8</center> |
|<center>**[6.5](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-6_5bpw_exl2)**</center> | <center>6903 MB</center> | <center>8</center> |
|<center>**[8.0](https://huggingface.co/Zoyd/mlabonne_NeuralDaredevil-8B-abliterated-8_0bpw_exl2)**</center> | <center>8157 MB</center> | <center>8</center> |

# NeuralDaredevil-8B-abliterated

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/61b8e2ba285851687028d395/gFEhcIDSKa3AWpkNfH91q.jpeg)

This is a DPO fine-tune of [mlabonne/Daredevil-8B-abliterated](https://huggingface.co/mlabonne/Daredevil-8B-abliterated), trained for one epoch on [mlabonne/orpo-dpo-mix-40k](https://huggingface.co/datasets/mlabonne/orpo-dpo-mix-40k).

## πŸ† Evaluation

### Open LLM Leaderboard

TBD.

### Nous

Evaluation performed using [LLM AutoEval](https://github.com/mlabonne/llm-autoeval). See the entire leaderboard [here](https://huggingface.co/spaces/mlabonne/Yet_Another_LLM_Leaderboard).

| Model | Average | AGIEval | GPT4All | TruthfulQA | Bigbench |
|---|---:|---:|---:|---:|---:|
| [**mlabonne/NeuralDaredevil-8B-abliterated**](https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated) [πŸ“„](https://gist.github.com/mlabonne/ae0bf16936cef900b72964b33c99edbc) | **55.87** | **43.73** | **73.6** | **59.36** | **46.8** |
| [mlabonne/Daredevil-8B](https://huggingface.co/mlabonne/Daredevil-8B) [πŸ“„](https://gist.github.com/mlabonne/080f9c5f153ea57a7ab7d932cf896f21) | 55.87 | 44.13 | 73.52 | 59.05 | 46.77 |
| [mlabonne/Daredevil-8B-abliterated](https://huggingface.co/mlabonne/Daredevil-8B-abliterated) [πŸ“„](https://gist.github.com/mlabonne/32cdd8460804662c856bcb2a20acd49e) | 55.06 | 43.29 | 73.33 | 57.47 | 46.17 |
| [NousResearch/Hermes-2-Theta-Llama-3-8B](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B) [πŸ“„](https://gist.github.com/mlabonne/5df2a3051dd6eb3368a77b684635dc05) | 54.28 | 43.9 | 72.62 | 56.36 | 44.23 |
| [openchat/openchat-3.6-8b-20240522](https://huggingface.co/openchat/openchat-3.6-8b-20240522) [πŸ“„](https://gist.github.com/mlabonne/95eef8e8d26b7b17910dcb78e1c95f4a) | 53.49 | 44.03 | 73.67 | 49.78 | 46.48 |
| [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) [πŸ“„](https://gist.github.com/mlabonne/8329284d86035e6019edb11eb0933628) | 51.34 | 41.22 | 69.86 | 51.65 | 42.64 |
| [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) [πŸ“„](https://gist.github.com/mlabonne/616b6245137a9cfc4ea80e4c6e55d847) | 45.42 | 31.1 | 69.95 | 43.91 | 36.7 |


## 🌳 Model family tree

![image/png](https://cdn-uploads.huggingface.co/production/uploads/61b8e2ba285851687028d395/ekwRGgnjzEOyprT8sEBFt.png)