---
base_model:
- FallenMerick/Space-Whale-Lite-13B
model_name: Space-Whale-Lite-13B
pipeline_tag: text-generation
tags:
- quantized
- 4-bit
- 5-bit
- 6-bit
- 8-bit
- GGUF
- merge
- frankenmerge
- text-generation
---

# Space-Whale-Lite-13B

These are GGUF quants for the following model:

https://huggingface.co/FallenMerick/Space-Whale-Lite-13B