mozilla/Meta-Llama-3.1-8B-Instruct-llamafile
Tags: llamafile, PyTorch, 8 languages, facebook, meta, llama, llama-3
arXiv: 2204.05149
License: llama3.1
1 contributor · 34 commits
Latest commit: bcbff6f (verified, 3 months ago) by jartine — "Quantize Q2_K with llamafile-0.8.13"
| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| .gitattributes | 2.78 kB | – | Quantize Q5_1 with llamafile-0.8.11 | 4 months ago |
| LICENSE | 17.3 kB | – | Update LICENSE | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.BF16.llamafile | 16.1 GB | LFS | Quantize BF16 with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.F16.llamafile | 16.1 GB | LFS | Quantize F16 with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q2_K.llamafile | 3.42 GB | LFS | Quantize Q2_K with llamafile-0.8.13 | 3 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_L.llamafile | 4.35 GB | LFS | Quantize Q3_K_L with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_M.llamafile | 4.26 GB | LFS | Quantize Q3_K_M with llamafile-0.8.13 | 3 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_S.llamafile | 3.69 GB | LFS | Quantize Q3_K_S with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_0.llamafile | 4.69 GB | LFS | Quantize Q4_0 with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_1.llamafile | 5.16 GB | LFS | Quantize Q4_1 with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile | 5.16 GB | LFS | Quantize Q4_K_M with llamafile-0.8.13 | 3 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_K_S.llamafile | 4.72 GB | LFS | Quantize Q4_K_S with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_0.llamafile | 5.84 GB | LFS | Quantize Q5_0 with llamafile-0.8.13 | 3 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_1.llamafile | 6.1 GB | LFS | Quantize Q5_1 with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_K_M.llamafile | 5.76 GB | LFS | Quantize Q5_K_M with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_K_S.llamafile | 5.63 GB | LFS | Quantize Q5_K_S with llamafile-0.8.11 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q6_K.llamafile | 6.84 GB | LFS | Quantize Q6_K with llamafile-0.8.13 | 3 months ago |
| Meta-Llama-3.1-8B-Instruct.Q8_0.llamafile | 8.57 GB | LFS | Quantize Q8_0 with llamafile-0.8.11 | 4 months ago |
| README.md | 31.3 kB | – | Update README.md | 4 months ago |
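The files above are llamafiles: self-contained executables that bundle the quantized weights with a llama.cpp-based runtime. A minimal sketch of the usual workflow on macOS/Linux, assuming the default llamafile behavior of serving a local chat UI (the Q4_K_M file is used purely as an example; any of the listed quantizations works the same way):

```shell
# Make the downloaded llamafile executable, then run it directly.
# (Filename taken from the listing above.)
chmod +x Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile
./Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile
# By default this starts a local server with a browser chat UI
# (typically on http://localhost:8080).
```

Smaller quantizations (Q2_K, Q3_K_*) trade answer quality for lower memory use; Q8_0 and the BF16/F16 files are closest to the original weights but need correspondingly more RAM.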