mmnga/tokyotech-llm-Swallow-MS-7b-v0.1-gguf

Format: GGUF · Languages: English, Japanese · Architecture: mistral · License: apache-2.0
Branch: main · 1 contributor · History: 15 commits
Latest commit: dbc86de (verified), 11 months ago, by mmnga: "Upload tokyotech-llm-Swallow-MS-7b-v0.1-q6_K.gguf with huggingface_hub"
File                                          Size       LFS   Updated
.gitattributes                                2.48 kB          11 months ago
README.md                                     896 Bytes        11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q2_K.gguf    2.77 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q3_K_S.gguf  3.22 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q3_K_M.gguf  3.57 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q3_K_L.gguf  3.88 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q4_0.gguf    4.17 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q4_K_S.gguf  4.2 GB     LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q4_K_M.gguf  4.43 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q5_0.gguf    5.06 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q5_K_S.gguf  5.06 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q5_K_M.gguf  5.2 GB     LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q6_K.gguf    6.01 GB    LFS   11 months ago
tokyotech-llm-Swallow-MS-7b-v0.1-q8_0.gguf    7.79 GB    LFS   11 months ago

Each model file was added in a commit of the form "Upload <filename> with huggingface_hub".
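Since the commit messages show these files were uploaded with huggingface_hub, the same library can fetch them. A minimal sketch: the `gguf_filename` helper is an assumption based on the naming pattern in the listing above, and `hf_hub_download` caches the file locally and returns its path.

```python
def gguf_filename(quant: str) -> str:
    """Build a filename for a given quantization level (e.g. "q4_K_M"),
    following the naming pattern used in this repository."""
    return f"tokyotech-llm-Swallow-MS-7b-v0.1-{quant}.gguf"


def download_gguf(quant: str = "q4_K_M") -> str:
    """Download one quantized model file; requires `pip install huggingface_hub`.

    Returns the local cache path of the downloaded file. Note the q4_K_M
    file is ~4.43 GB; pick q2_K (~2.77 GB) to save bandwidth.
    """
    from huggingface_hub import hf_hub_download

    return hf_hub_download(
        repo_id="mmnga/tokyotech-llm-Swallow-MS-7b-v0.1-gguf",
        filename=gguf_filename(quant),
    )
```

The resulting `.gguf` file can then be loaded by GGUF-compatible runtimes such as llama.cpp.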