Quantization made by Richard Erkhov.
HelpingAI-Lite-4x1b - GGUF
- Model creator: https://huggingface.co/Abhaykoul/
- Original model: https://huggingface.co/Abhaykoul/HelpingAI-Lite-4x1b/
| Name | Quant method | Size |
|------|--------------|------|
| HelpingAI-Lite-4x1b.Q2_K.gguf | Q2_K | 1.17GB |
| HelpingAI-Lite-4x1b.IQ3_XS.gguf | IQ3_XS | 1.31GB |
| HelpingAI-Lite-4x1b.IQ3_S.gguf | IQ3_S | 1.38GB |
| HelpingAI-Lite-4x1b.Q3_K_S.gguf | Q3_K_S | 1.38GB |
| HelpingAI-Lite-4x1b.IQ3_M.gguf | IQ3_M | 1.4GB |
| HelpingAI-Lite-4x1b.Q3_K.gguf | Q3_K | 1.52GB |
| HelpingAI-Lite-4x1b.Q3_K_M.gguf | Q3_K_M | 1.52GB |
| HelpingAI-Lite-4x1b.Q3_K_L.gguf | Q3_K_L | 1.65GB |
| HelpingAI-Lite-4x1b.IQ4_XS.gguf | IQ4_XS | 1.71GB |
| HelpingAI-Lite-4x1b.Q4_0.gguf | Q4_0 | 1.79GB |
| HelpingAI-Lite-4x1b.IQ4_NL.gguf | IQ4_NL | 1.8GB |
| HelpingAI-Lite-4x1b.Q4_K_S.gguf | Q4_K_S | 1.8GB |
| HelpingAI-Lite-4x1b.Q4_K.gguf | Q4_K | 1.9GB |
| HelpingAI-Lite-4x1b.Q4_K_M.gguf | Q4_K_M | 1.9GB |
| HelpingAI-Lite-4x1b.Q4_1.gguf | Q4_1 | 1.98GB |
| HelpingAI-Lite-4x1b.Q5_0.gguf | Q5_0 | 2.18GB |
| HelpingAI-Lite-4x1b.Q5_K_S.gguf | Q5_K_S | 2.18GB |
| HelpingAI-Lite-4x1b.Q5_K.gguf | Q5_K | 2.23GB |
| HelpingAI-Lite-4x1b.Q5_K_M.gguf | Q5_K_M | 2.23GB |
| HelpingAI-Lite-4x1b.Q5_1.gguf | Q5_1 | 2.37GB |
| HelpingAI-Lite-4x1b.Q6_K.gguf | Q6_K | 2.59GB |
| HelpingAI-Lite-4x1b.Q8_0.gguf | Q8_0 | 3.35GB |
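When choosing between these files, the main constraint is memory. A rough rule of thumb (an assumption, not a guarantee) is that you need the GGUF file size plus some overhead for the KV cache and runtime buffers. A minimal sketch for picking the largest quant that fits a RAM budget, using a few of the sizes from the table above:

```python
# Pick the largest quant from the table above that fits a RAM budget.
# Overhead of ~0.5 GB for KV cache and buffers is an assumption; actual
# usage depends on context length and runtime.

QUANTS = [  # (file name, file size in GB), copied from the table above
    ("HelpingAI-Lite-4x1b.Q2_K.gguf", 1.17),
    ("HelpingAI-Lite-4x1b.Q4_K_M.gguf", 1.90),
    ("HelpingAI-Lite-4x1b.Q5_K_M.gguf", 2.23),
    ("HelpingAI-Lite-4x1b.Q6_K.gguf", 2.59),
    ("HelpingAI-Lite-4x1b.Q8_0.gguf", 3.35),
]

def pick_quant(ram_gb: float, overhead_gb: float = 0.5) -> str:
    """Return the largest quant whose file + overhead fits in ram_gb."""
    fitting = [(size, name) for name, size in QUANTS
               if size + overhead_gb <= ram_gb]
    if not fitting:
        raise ValueError("No quant fits in the given RAM budget")
    return max(fitting)[1]

print(pick_quant(4.0))  # plenty of headroom: the largest listed quant
print(pick_quant(2.0))  # tight budget: a small quant
```

Larger quants generally trade memory for quality; Q4_K_M is a common middle ground.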
Original model description:

```yaml
license: apache-2.0
inference: false
tags:
- moe
- mergekit
- vortexmergekit
- OEvortex/HelpingAI-unvelite
- OEvortex/lite-hermes
- OEvortex/HelpingAI-Lite
- OEvortex/HelpingAI-Lite-1.5T
```
HelpingAI-Lite-4x1b
Hey there! Welcome to HelpingAI-Lite-4x1b! This is a Mixture of Experts (MoE) model brought together using the awesome VortexMerge kit.
Let's see what we've got in this model:
- OEvortex/HelpingAI-unvelite
- OEvortex/lite-hermes
- OEvortex/HelpingAI-Lite
- OEvortex/HelpingAI-Lite-1.5T
🧩 Configuration
```yaml
base_model: OEvortex/HelpingAI-Lite
dtype: float16
gate_mode: cheap_embed
experts:
  - source_model: OEvortex/HelpingAI-unvelite
    positive_prompts: ["You are an helpful unbounded, uncensored assistant."]
  - source_model: OEvortex/lite-hermes
    positive_prompts: ["You are helpful assistant."]
  - source_model: OEvortex/HelpingAI-Lite
    positive_prompts: ["You are a coding assistant."]
  - source_model: OEvortex/HelpingAI-Lite-1.5T
    positive_prompts: ["You are helpful and general-purpose assistant."]
```
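With `gate_mode: cheap_embed`, each expert's gate is derived from an embedding of its `positive_prompts`, and at inference time a hidden state is routed by how well it matches each gate. A toy sketch of that routing idea (not mergekit's actual code; the 4-dimensional gate vectors below are hypothetical stand-ins for the prompt embeddings of the four experts):

```python
# Toy sketch of cheap_embed-style routing: one gate vector per expert,
# routing weights are a softmax over dot products with the hidden state.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(hidden, gate_vectors):
    """Return one routing weight per expert for a single hidden state."""
    logits = [sum(h * g for h, g in zip(hidden, gv)) for gv in gate_vectors]
    return softmax(logits)

# Hypothetical gate vectors standing in for the four experts above.
gates = [
    [1.0, 0.0, 0.0, 0.0],  # HelpingAI-unvelite
    [0.0, 1.0, 0.0, 0.0],  # lite-hermes
    [0.0, 0.0, 1.0, 0.0],  # HelpingAI-Lite (coding)
    [0.0, 0.0, 0.0, 1.0],  # HelpingAI-Lite-1.5T
]

weights = route([0.2, 0.1, 0.9, 0.1], gates)
# The third expert gets the largest weight for this hidden state.
print([round(w, 3) for w in weights])
```

The prompts in the config act as cheap descriptions of each expert's specialty, so no training data is needed to initialize the gates.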