silveroxides/Chroma-Misc-Models
No model card
Downloads last month: 2,183
Format: GGUF
Model size: 8.9B params
Architecture: flux
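The repository ships plain GGUF checkpoint files, so any of them can be fetched programmatically with the huggingface_hub client. A minimal sketch; the value passed to filename= is a placeholder, since individual file names are listed under the Files tab rather than reproduced here:

from huggingface_hub import hf_hub_download

# Download one quantized checkpoint from this repository.
# NOTE: the filename below is a placeholder; substitute a real file name
# from the repo's Files tab (e.g. one of the Q4_0 or Q8_0 .gguf files).
local_path = hf_hub_download(
    repo_id="silveroxides/Chroma-Misc-Models",
    filename="chroma-Q4_0.gguf",  # placeholder file name
)
print(local_path)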
Hardware compatibility
Quantization   Type     File size   Files
4-bit          Q4_0     5.43 GB     38
4-bit          Q4_0     5.78 GB     4
4-bit          Q4_K_M   5.91 GB     3
6-bit          Q6_K     8.15 GB     3
8-bit          Q8_0     9.73 GB     1
8-bit          Q8_0     9.74 GB     37
8-bit          Q8_0     10.4 GB     7
16-bit         BF16     17.8 GB     38
16-bit         BF16     19 GB       7
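The table above only records the quantization type and file size of each checkpoint. The per-tensor quantization mix and the parameter count (which should come out near the 8.9B reported above) can be read from any downloaded file with the gguf Python package. A minimal sketch, assuming a locally downloaded file; the file name is again a placeholder:

from collections import Counter

from gguf import GGUFReader  # pip install gguf

reader = GGUFReader("chroma-Q4_0.gguf")  # placeholder local file name

# Tally tensors by quantization type and sum the total parameter count.
quant_mix = Counter(t.tensor_type.name for t in reader.tensors)
total_params = sum(int(t.n_elements) for t in reader.tensors)

print(f"parameters: {total_params / 1e9:.2f}B")
for quant, count in quant_mix.most_common():
    print(f"{quant}: {count} tensors")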
Inference Providers
This model isn't deployed by any Inference Provider.