QuantStack / InternVL3_5-38B-gguf
Tags: GGUF · conversational
License: apache-2.0
Repository: 124 GB · 1 contributor · History: 16 commits
Latest commit: c25816a (verified) by wsbagnsv1, 2 months ago: "Upload InternVL3_5-38b-q3_k_m.gguf with huggingface_hub"
File                               Size       Last commit message                                                 Updated
.gitattributes                     2.23 kB    Upload InternVL3_5-38b-q3_k_m.gguf with huggingface_hub             2 months ago
InternVL3_5-38B-iq4_xs.gguf        17.9 GB    Upload InternVL3_5-38B-iq4_xs.gguf                                  2 months ago
InternVL3_5-38b-q2_k.gguf          12.3 GB    Rename internvl3_5-38b-q2_k.gguf to InternVL3_5-38b-q2_k.gguf       2 months ago
InternVL3_5-38b-q3_k_m.gguf        16 GB      Upload InternVL3_5-38b-q3_k_m.gguf with huggingface_hub             2 months ago
InternVL3_5-38b-q3_k_s.gguf        14.4 GB    Rename internvl3_5-38b-q3_k_s.gguf to InternVL3_5-38b-q3_k_s.gguf   2 months ago
InternVL3_5-38b-q8_0.gguf          34.8 GB    Rename internvl3_5-38b-q8_0.gguf to InternVL3_5-38b-q8_0.gguf       2 months ago
README.md                          324 Bytes  Update README.md                                                    2 months ago
mmproj-InternVL3_5-38B-bf16.gguf   11.3 GB    Upload 2 files                                                      2 months ago
mmproj-InternVL3_5-38B-f16.gguf    11.3 GB    Upload 2 files                                                      2 months ago
mmproj-InternVL3_5-38B-q8_0.gguf   6 GB       Upload mmproj-InternVL3_5-38B-q8_0.gguf                             2 months ago
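
Below is a minimal sketch of fetching one quant and a matching vision projector from this repo with the huggingface_hub Python package (an assumption; install it with `pip install huggingface_hub`). The filenames are taken from the listing above; a GGUF-capable runtime such as llama.cpp would still need to be pointed at both the weight file and the mmproj file to use the vision side.

```python
# Sketch: download a quantized weight file and a vision projector from this repo.
# Assumes huggingface_hub is installed; files are cached locally and the
# returned values are the local paths.
from huggingface_hub import hf_hub_download

repo_id = "QuantStack/InternVL3_5-38B-gguf"

model_path = hf_hub_download(
    repo_id=repo_id,
    filename="InternVL3_5-38b-q3_k_m.gguf",        # ~16 GB quant from the listing
)
mmproj_path = hf_hub_download(
    repo_id=repo_id,
    filename="mmproj-InternVL3_5-38B-q8_0.gguf",   # ~6 GB vision projector
)

print("model: ", model_path)
print("mmproj:", mmproj_path)
```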