medmekk/SmolLM2-1.7B-Instruct.GGUF
Tags: GGUF, imatrix, conversational
Total size: 21 GB
1 contributor · 2 commits
Latest commit: "Upload quantized models" (01d16a0, verified) by medmekk (HF Staff), 10 months ago
Files (all added in the "Upload quantized models" commit, 10 months ago; GGUF files stored via Xet/LFS):

File                                         Size
.gitattributes                               2.95 kB
README.md                                    1.4 kB
SmolLM2-1.7B-Instruct-IQ3_M_imat.gguf        810 MB
SmolLM2-1.7B-Instruct-IQ3_XXS_imat.gguf      680 MB
SmolLM2-1.7B-Instruct-IQ4_NL_imat.gguf       991 MB
SmolLM2-1.7B-Instruct-IQ4_XS_imat.gguf       940 MB
SmolLM2-1.7B-Instruct-Q2_K.gguf              675 MB
SmolLM2-1.7B-Instruct-Q3_K_L.gguf            933 MB
SmolLM2-1.7B-Instruct-Q3_K_M.gguf            860 MB
SmolLM2-1.7B-Instruct-Q3_K_S.gguf            777 MB
SmolLM2-1.7B-Instruct-Q4_0.gguf              991 MB
SmolLM2-1.7B-Instruct-Q4_K_M.gguf            1.06 GB
SmolLM2-1.7B-Instruct-Q4_K_M_imat.gguf       1.06 GB
SmolLM2-1.7B-Instruct-Q4_K_S.gguf            999 MB
SmolLM2-1.7B-Instruct-Q4_K_S_imat.gguf       999 MB
SmolLM2-1.7B-Instruct-Q5_0.gguf              1.19 GB
SmolLM2-1.7B-Instruct-Q5_K_M.gguf            1.23 GB
SmolLM2-1.7B-Instruct-Q5_K_M_imat.gguf       1.23 GB
SmolLM2-1.7B-Instruct-Q5_K_S.gguf            1.19 GB
SmolLM2-1.7B-Instruct-Q5_K_S_imat.gguf       1.19 GB
SmolLM2-1.7B-Instruct-Q6_K.gguf              1.41 GB
SmolLM2-1.7B-Instruct-Q8_0.gguf              1.82 GB
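
To fetch one of these quantized files programmatically, a minimal sketch using huggingface_hub is shown below; the choice of the Q4_K_M file is only an example, and any file name from the listing above works the same way.

```python
# Minimal sketch: download a single quantized GGUF file from this repo
# with huggingface_hub, then hand the resulting path to a GGUF-capable
# runtime such as llama.cpp. The quant chosen here is an example only.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="medmekk/SmolLM2-1.7B-Instruct.GGUF",
    filename="SmolLM2-1.7B-Instruct-Q4_K_M.gguf",
)
print(local_path)  # location of the downloaded .gguf file in the local HF cache
```

From there the file can be loaded by any GGUF runtime, for example `llama-cli -m <path>` from llama.cpp; the lower-bit quants in the listing trade some output quality for a smaller memory footprint.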