---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---

<div align="center">
  <b style="font-size: 40px;">Phi-3.5-mini-instruct (128K) - Uncensored</b>


</div>


<img src="https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored/resolve/main/Misc/Uncensored.png" alt="Uncensored.png" style="width: 90%; min-width: 500px; display: block; margin: auto;">



This is the base model; no additional data was used beyond my uncensoring protocol.



# Model Details


- Censorship level: <b>Low - Medium</b>

- <b>6.4</b> / 10 (10 = completely uncensored)

- UGI score:



  <img src="https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored/resolve/main/Misc/UGI_Score.png" alt="UGI Score" style="width: 70%; min-width: 500px; display: block; margin: auto;">






## Phi-3.5-mini-instruct_Uncensored is available at the following quantizations:

- Original: [FP16](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored)
- GGUF: [Static Quants](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored_GGUFs) | [iMatrix_GGUF-bartowski](https://huggingface.co/bartowski/Phi-3.5-mini-instruct_Uncensored-GGUF) | [iMatrix_GGUF-mradermacher](https://huggingface.co/mradermacher/Phi-3.5-mini-instruct_Uncensored-i1-GGUF)
- EXL2: [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-3.0bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-4.0bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-5.0bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-6.0bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-7.0bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored-EXL2-8.0bpw)
- Specialized: [FP8](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored_FP8)
- Mobile (ARM): [Q4_0_X_X](https://huggingface.co/SicariusSicariiStuff/Phi-3.5-mini-instruct_Uncensored_ARM)
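As a rough guide to picking an EXL2 quantization, weight size scales roughly linearly with bits per weight (bpw). A minimal sketch of that arithmetic, assuming Phi-3.5-mini's approximately 3.8B parameters (actual file sizes also include metadata and non-quantized tensors, so treat these as lower-bound estimates):

```python
def approx_quant_size_gb(n_params: float, bpw: float) -> float:
    """Approximate on-disk weight size in GB: parameters x bits-per-weight / 8 bits-per-byte."""
    return n_params * bpw / 8 / 1e9

N_PARAMS = 3.8e9  # approximate parameter count of Phi-3.5-mini

# Print an estimate for each EXL2 quantization listed above.
for bpw in (3.0, 4.0, 5.0, 6.0, 7.0, 8.0):
    print(f"{bpw:.1f} bpw ~ {approx_quant_size_gb(N_PARAMS, bpw):.1f} GB")
```

For example, the 4.0 bpw quant works out to roughly 1.9 GB of weights, which is why the lower-bpw EXL2 variants fit comfortably on small consumer GPUs.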


### Support
<img src="https://i.imgur.com/0lHHN95.png" alt="GPUs too expensive" style="width: 10%; min-width: 100px; display: block; margin-left: 0;">

- [My Ko-fi page](https://ko-fi.com/sicarius) All donations go toward research resources and compute; every bit is appreciated 🙏🏻

## Other stuff
- [Blog and updates](https://huggingface.co/SicariusSicariiStuff/Blog_And_Updates) Updates and rambles; a mix between a diary and a blog.
- [LLAMA-3_8B_Unaligned](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned) The grand project that started it all.