# Gemma-2b-ties

Gemma-2b-ties is a merge of the following models using mergekit, with unsloth/gemma-2b-bnb-4bit as the base model:

* jiayihao03/gemma2b_code_java
* jiayihao03/gemma_2b_code_python_4bit

## 🧩 Configuration

```yaml
models:
  - model: unsloth/gemma-2b-bnb-4bit
  - model: jiayihao03/gemma2b_code_java
    parameters:
      density: 0.5
      weight: 0.3
  - model: jiayihao03/gemma_2b_code_python_4bit
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: unsloth/gemma-2b-bnb-4bit
parameters:
  normalize: true
dtype: float16
```
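
## 💻 Usage

The configuration above can be run with mergekit (for example via its `mergekit-yaml` command) to produce the merged checkpoint. Once the merged weights are published, they load like any other causal language model. The snippet below is a minimal sketch, not a tested recipe: the repo id `your-username/Gemma-2b-ties` is a placeholder for wherever this merge is hosted, and `device_map="auto"` assumes the `accelerate` package is installed.

```python
# Minimal inference sketch for the merged model. The repo id is a placeholder;
# replace it with the actual Hugging Face repo id of this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Gemma-2b-ties"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # matches the merge's float16 dtype
    device_map="auto",           # requires the accelerate package
)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```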