Swallow-MoE-2x13B-v0.1-GGUF

Description

This is the quantized GGUF version of Aratako/Swallow-MoE-2x13B-v0.1. Please refer to the original model for license details and further information.

Currently, only Q4_K_M is available. Other quantization levels may be provided if there is demand.
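A GGUF file like this can be run locally with llama.cpp or compatible tooling. A minimal sketch, assuming the quantized file in this repo is named `Swallow-MoE-2x13B-v0.1-Q4_K_M.gguf` (an assumption; check the repository's file listing for the actual name):

```shell
# Download the Q4_K_M file from this repo (the filename below is an
# assumption; verify it against the repository's file listing).
huggingface-cli download Aratako/Swallow-MoE-2x13B-v0.1-GGUF \
  Swallow-MoE-2x13B-v0.1-Q4_K_M.gguf --local-dir .

# Run an interactive prompt with llama.cpp (llama-cli is the current
# binary name; older builds shipped it as ./main).
llama-cli -m Swallow-MoE-2x13B-v0.1-Q4_K_M.gguf -p "Hello" -n 128
```

The same file also works with any other GGUF-compatible runtime, such as llama-cpp-python or Ollama.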

Model details

- Format: GGUF
- Model size: 21.6B params
- Architecture: llama
- Quantization: 4-bit (Q4_K_M)

