Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
Umbra-v3-MoE-4x11b-2ex - GGUF
- Model creator: https://huggingface.co/Steelskull/
- Original model: https://huggingface.co/Steelskull/Umbra-v3-MoE-4x11b-2ex/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Umbra-v3-MoE-4x11b-2ex.Q2_K.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q2_K.gguf) | Q2_K | 12.28GB |
| [Umbra-v3-MoE-4x11b-2ex.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.IQ3_XS.gguf) | IQ3_XS | 13.74GB |
| [Umbra-v3-MoE-4x11b-2ex.IQ3_S.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.IQ3_S.gguf) | IQ3_S | 14.52GB |
| [Umbra-v3-MoE-4x11b-2ex.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q3_K_S.gguf) | Q3_K_S | 14.5GB |
| [Umbra-v3-MoE-4x11b-2ex.IQ3_M.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.IQ3_M.gguf) | IQ3_M | 14.8GB |
| [Umbra-v3-MoE-4x11b-2ex.Q3_K.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q3_K.gguf) | Q3_K | 16.1GB |
| [Umbra-v3-MoE-4x11b-2ex.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q3_K_M.gguf) | Q3_K_M | 16.1GB |
| [Umbra-v3-MoE-4x11b-2ex.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q3_K_L.gguf) | Q3_K_L | 17.45GB |
| [Umbra-v3-MoE-4x11b-2ex.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.IQ4_XS.gguf) | IQ4_XS | 18.13GB |
| [Umbra-v3-MoE-4x11b-2ex.Q4_0.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q4_0.gguf) | Q4_0 | 18.95GB |
| [Umbra-v3-MoE-4x11b-2ex.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.IQ4_NL.gguf) | IQ4_NL | 19.13GB |
| [Umbra-v3-MoE-4x11b-2ex.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q4_K_S.gguf) | Q4_K_S | 19.11GB |
| [Umbra-v3-MoE-4x11b-2ex.Q4_K.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q4_K.gguf) | Q4_K | 20.32GB |
| [Umbra-v3-MoE-4x11b-2ex.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q4_K_M.gguf) | Q4_K_M | 20.32GB |
| [Umbra-v3-MoE-4x11b-2ex.Q4_1.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q4_1.gguf) | Q4_1 | 21.04GB |
| [Umbra-v3-MoE-4x11b-2ex.Q5_0.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q5_0.gguf) | Q5_0 | 23.13GB |
| [Umbra-v3-MoE-4x11b-2ex.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q5_K_S.gguf) | Q5_K_S | 23.13GB |
| [Umbra-v3-MoE-4x11b-2ex.Q5_K.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q5_K.gguf) | Q5_K | 23.84GB |
| [Umbra-v3-MoE-4x11b-2ex.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q5_K_M.gguf) | Q5_K_M | 23.84GB |
| [Umbra-v3-MoE-4x11b-2ex.Q5_1.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q5_1.gguf) | Q5_1 | 25.23GB |
| [Umbra-v3-MoE-4x11b-2ex.Q6_K.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q6_K.gguf) | Q6_K | 27.58GB |
| [Umbra-v3-MoE-4x11b-2ex.Q8_0.gguf](https://huggingface.co/RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf/blob/main/Umbra-v3-MoE-4x11b-2ex.Q8_0.gguf) | Q8_0 | 35.72GB |
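To use one of these files locally, a minimal sketch is shown below, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed; the Q4_K_M quant, context size, and prompt are illustrative choices, not recommendations.

```python
# Minimal sketch: download one quant from this repo and run it with llama-cpp-python.
# Assumes `pip install huggingface_hub llama-cpp-python`; the quant, context size,
# and prompt below are illustrative only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a single GGUF file from the table above (cached locally by huggingface_hub).
model_path = hf_hub_download(
    repo_id="RichardErkhov/Steelskull_-_Umbra-v3-MoE-4x11b-2ex-gguf",
    filename="Umbra-v3-MoE-4x11b-2ex.Q4_K_M.gguf",
)

# Load the model; n_gpu_layers=-1 offloads all layers when a GPU build is available.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

# Plain completion call; adjust sampling parameters to taste.
out = llm("Write a short scene set in a rain-soaked city.", max_tokens=256)
print(out["choices"][0]["text"])
```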
Original model description:
---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- Himitsui/Kaiju-11B
- Sao10K/Fimbulvetr-11B-v2
- decapoda-research/Antares-11b-v2
- beberik/Nyxene-v3-11B
base_model:
- Himitsui/Kaiju-11B
- Sao10K/Fimbulvetr-11B-v2
- decapoda-research/Antares-11b-v2
- beberik/Nyxene-v3-11B
---
<!DOCTYPE html>
<style>
body {
font-family: 'Quicksand', sans-serif;
background: linear-gradient(135deg, #2E3440 0%, #1A202C 100%);
color: #D8DEE9;
margin: 0;
padding: 0;
font-size: 16px;
}
.container {
width: 80%;
max-width: 800px;
margin: 20px auto;
background-color: rgba(255, 255, 255, 0.02);
padding: 20px;
border-radius: 12px;
box-shadow: 0 4px 10px rgba(0, 0, 0, 0.2);
backdrop-filter: blur(10px);
border: 1px solid rgba(255, 255, 255, 0.1);
}
.header h1 {
font-size: 28px;
color: #ECEFF4;
margin: 0 0 20px 0;
text-shadow: 2px 2px 4px rgba(0, 0, 0, 0.3);
}
.update-section {
margin-top: 30px;
}
.update-section h2 {
font-size: 24px;
color: #88C0D0;
}
.update-section p {
font-size: 16px;
line-height: 1.6;
color: #ECEFF4;
}
.info img {
width: 100%;
border-radius: 10px;
margin-bottom: 15px;
}
a {
color: #88C0D0;
text-decoration: none;
}
a:hover {
color: #A3BE8C;
}
.button {
display: inline-block;
background-color: #5E81AC;
color: #E5E9F0;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
text-decoration: none;
}
.button:hover {
background-color: #81A1C1;
}
</style>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Umbra-v3-MoE-4x11b Data Card</title>
<link href="https://fonts.googleapis.com/css2?family=Quicksand:wght@400;500;600&display=swap" rel="stylesheet">
</head>
<body>
<div class="container">
<div class="header">
<h1>Umbra-v3-MoE-4x11b-2ex</h1>
</div>
<div class="info">
<img src="https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/MHmVGOLGh4I5MfQ83iiXS.jpeg">
<p><strong>Creator:</strong> <a href="https://huggingface.co/Steelskull" target="_blank">SteelSkull</a></p>
            <p><strong>About Umbra-v3-MoE-4x11b:</strong> A Mixture of Experts model designed for general assistance, with a special knack for storytelling and RP/ERP.</p>
            <p>It integrates models from notable sources for enhanced performance in diverse tasks. This is the two-expert version; a brief loading sketch follows after this card.</p>
<p><strong>Source Models:</strong></p>
<ul>
<li><a href="https://huggingface.co/Himitsui/Kaiju-11B">Himitsui/Kaiju-11B</a></li>
<li><a href="https://huggingface.co/Sao10K/Fimbulvetr-11B-v2">Sao10K/Fimbulvetr-11B-v2</a></li>
<li><a href="https://huggingface.co/decapoda-research/Antares-11b-v2">decapoda-research/Antares-11b-v2</a></li>
<li><a href="https://huggingface.co/beberik/Nyxene-v3-11B">beberik/Nyxene-v3-11B</a></li>
</ul>
</div>
<div class="update-section">
<h2>Update-Log:</h2>
<p>The [Umbra Series] keeps rolling out from the [Lumosia Series] garage, aiming to be your digital Alfred with a side of Shakespeare for those RP/ERP nights.</p>
<p><strong>What's Fresh in v3?</strong></p>
            <p>Didn’t reinvent the wheel, just slapped on some fancier rims. Upgraded the models and tweaked the prompts a bit. Now, Umbra's not just a general-use LLM; it's also focused on spinning stories and "Stories".</p>
<p><strong>Negative Prompt Minimalism</strong></p>
<p>Got the prompts to do a bit of a diet and gym routine—more beef on the positives, trimming down the negatives as usual with a dash of my midnight musings.</p>
<p><strong>Still Guessing, Aren’t We?</strong></p>
<p>Just so we're clear, "v3" is not the messiah of updates. It’s another experiment in the saga.</p>
<p>Dive into Umbra v3 and toss your two cents my way. Your feedback is the caffeine in my code marathon.</p>
</div>
</div>
</body>
</html>
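For the original, unquantized model referenced in the card above (rather than the GGUF files in this repo), a minimal loading sketch follows, assuming `Steelskull/Umbra-v3-MoE-4x11b-2ex` loads through transformers' standard causal-LM classes and that enough memory is available for a large MoE; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the original (unquantized) MoE model with transformers.
# Assumes the repo uses a standard causal-LM architecture and that sufficient
# GPU/CPU memory is available; all settings below are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Steelskull/Umbra-v3-MoE-4x11b-2ex"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # spread layers across available devices
)

# Simple generation example; swap in your own prompt and sampling settings.
inputs = tokenizer("Tell me a short story about a lighthouse keeper.", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```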