knifeayumu / LLM_Collection
Tags: GGUF · Inference Endpoints
LLM_Collection / Unused
1 contributor · History: 19 commits
Latest commit: 060033b (verified) · knifeayumu · 6 months ago
"Rename WestLake-7B-v2-Q6_K.gguf to Unused/WestLake-7B-v2-Q6_K.gguf"
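Any file in this folder can be fetched programmatically. A minimal sketch using huggingface_hub is shown here; the repo_id and filename come from the listing below, while the download itself lands in the local Hugging Face cache:

```python
# Minimal download sketch (assumes `pip install huggingface_hub`).
# repo_id and filename are taken from this listing; everything else is illustrative.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="knifeayumu/LLM_Collection",
    filename="Unused/WestLake-7B-v2-Q6_K.gguf",  # one of the LFS-tracked GGUF files below
)
print(path)  # local path inside the Hugging Face cache
```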
All .gguf files below are tracked with Git LFS, flagged Safe, and were moved into Unused/ 6 months ago by a commit of the form "Rename <file> to Unused/<file>".

Eris_7B-Q6_K-imatrix.gguf · 5.94 GB
Erosumika-7B.q8_0.gguf · 7.7 GB
Fett-uccine-7B-Q6_K.gguf · 5.94 GB
Fimbulvetr-10.7B-v1.q6_K.gguf · 8.81 GB
LLaMA2-13B-Psyfighter2.Q5_K_M.gguf · 9.23 GB
LLaMA2-13B-Tiefighter.Q5_K_M.gguf · 9.23 GB
Llama-3-Lumimaid-8B-v0.1-OAS.q8_0.gguf · 8.54 GB
Llama-3-Soliloquy-8B-v2.Q8_0.gguf · 8.54 GB
Llama-3some-8B-v1-rc1-Q8_0.gguf · 8.54 GB
NeteLegacy-13B.q5_k_m.gguf · 9.23 GB
Noromaid-13b-v0.1.1.q5_k_m.gguf · 9.23 GB
Noromaid-13b-v0.2.q5_k_m.gguf · 9.23 GB
Nous-Hermes-2-Mistral-7B-DPO.Q6_K.gguf · 5.94 GB
Prodigy_7B-Q6_K-imatrix.gguf · 5.94 GB
ShoriRP.v077.q6_k.gguf · 5.94 GB
Starling-LM-7B-beta-Q6_K.gguf · 5.94 GB
Toppy-M-7B.q6_k.gguf · 5.94 GB
WestLake-7B-v2-Q6_K.gguf · 5.94 GB
unused.md · 0 Bytes · created 6 months ago ("Create Unused/unused.md")
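These are quantized GGUF checkpoints, so they are intended for llama.cpp-compatible runtimes rather than transformers. A minimal local-inference sketch with llama-cpp-python follows; the package, context size, GPU-offload setting, and prompt are all assumptions for illustration, not part of this repository:

```python
# Hypothetical inference sketch; assumes `pip install llama-cpp-python` and that a
# GGUF file was downloaded as shown above. All parameters are illustrative.
from llama_cpp import Llama

llm = Llama(
    model_path=path,    # local path returned by hf_hub_download above
    n_ctx=4096,         # context window; choose per model and available memory
    n_gpu_layers=0,     # >0 offloads layers to GPU if the wheel was built with GPU support
)
out = llm("Write a one-line greeting.", max_tokens=32)
print(out["choices"][0]["text"])
```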