stabilityai/stablelm-2-zephyr-1_6b
Tags: Text Generation · Transformers · Safetensors · GGUF · 8 datasets · English · stablelm · causal-lm · conversational · Inference Endpoints
arXiv: 2305.18290
License: other
Files and versions (branch: main)
stablelm-2-zephyr-1_6b · 7 contributors · History: 45 commits
Latest commit: 2f275b1 (verified) by pvduy — "Update tokenizer_config.json", 6 months ago
File                                       Size       LFS   Last commit message                                   Last updated
.gitattributes                             1.52 kB          initial commit                                        10 months ago
LICENSE                                    7.45 kB          Upload LICENSE                                        10 months ago
README.md                                  7.84 kB          update(tokenizer): convert to `GPT2Tokenizer` (#15)   9 months ago
config.json                                608 Bytes        revert(config): use `float16` torch dtype             9 months ago
configuration_stablelm.py                  9.07 kB          merge: upload transformers implementation (#14)       9 months ago
generation_config.json                     121 Bytes        Update generation_config.json                         6 months ago
merges.txt                                 917 kB           update(tokenizer): convert to `GPT2Tokenizer` (#15)   9 months ago
model.safetensors                          3.29 GB    LFS   Upload StableLMEpochForCausalLM                       10 months ago
modeling_stablelm.py                       63.1 kB          merge: upload transformers implementation (#14)       9 months ago
special_tokens_map.json                    784 Bytes        update(tokenizer): convert to `GPT2Tokenizer` (#15)   9 months ago
stablelm-2-zephyr-1_6b-OpenVINO-4bit.bin   1.05 GB    LFS   OpenVINO NNCF 4BIT quantization                       10 months ago
stablelm-2-zephyr-1_6b-OpenVINO-4bit.xml   2.89 MB    LFS   OpenVINO NNCF 4BIT quantization                       10 months ago
stablelm-2-zephyr-1_6b-Q4_0.gguf           983 MB     LFS   GGUF Q4_0, Q4_1, Q8_0 quantized files                 10 months ago
stablelm-2-zephyr-1_6b-Q4_1.gguf           1.07 GB    LFS   GGUF Q4_0, Q4_1, Q8_0 quantized files                 10 months ago
stablelm-2-zephyr-1_6b-Q5_K_M.gguf         1.19 GB    LFS   GGUF Q5_K_M quantize                                  10 months ago
stablelm-2-zephyr-1_6b-Q8_0.gguf           1.75 GB    LFS   GGUF Q4_0, Q4_1, Q8_0 quantized files                 10 months ago
stablelm-2-zephyr-1_6b.gguf                3.29 GB    LFS   FP16 GGUF file                                        10 months ago
tokenizer.json                             4.24 MB          update(tokenizer): convert to `GPT2Tokenizer` (#15)   9 months ago
tokenizer_config.json                      1.4 kB           Update tokenizer_config.json                          6 months ago
vocab.json                                 2.01 MB          update(tokenizer): convert to `GPT2Tokenizer` (#15)   9 months ago