Active filters: MoE
Aratako/ELYZA-japanese-Llama-2-MoE-2x7B-v0.1 • Text Generation • Updated • 11
Aratako/ELYZA-japanese-Llama-2-MoE-2x7B-v0.1-GGUF • Updated • 6 • 2
Aratako/ELYZA-japanese-Llama-2-fast-MoE-2x7B-v0.1 • Text Generation • Updated • 16
Aratako/ELYZA-japanese-Llama-2-fast-MoE-2x7B-v0.1-GGUF
Aratako/karakuri-lm-chat-upscaled-103b-v0.1-GGUF
JuncaiL/llama-8x265m-moe • Text Generation • Updated • 24 • 2
mradermacher/UNAversal-8x7B-v1beta-GGUF
mradermacher/UNAversal-8x7B-v1beta-i1-GGUF • Updated • 246 • 1
DavidAU/MoECPM-Untrained-4x2b-Q6_K-GGUF
DavidAU/SOLARC-MOE-10.7Bx6-Q3_K_S-GGUF • Text Generation • Updated • 8
maciek-pioro/Mixtral-8x7B-v0.1-pl • Feature Extraction • Updated • 47 • 5
Skylaude/WizardLM-2-4x7B-MoE • Text Generation • Updated • 23
Skylaude/WizardLM-2-4x7B-MoE-exl2-6_0bpw • Text Generation • Updated • 7
Skylaude/WizardLM-2-4x7B-MoE-exl2-3_0bpw • Text Generation • Updated • 12 • 1
Skylaude/WizardLM-2-4x7B-MoE-exl2-4_25bpw • Text Generation • Updated • 7
Skylaude/WizardLM-2-4x7B-MoE-exl2-3_5bpw • Text Generation • Updated • 6
Skylaude/WizardLM-2-4x7B-MoE-exl2-8_0bpw • Text Generation • Updated • 4
Skylaude/WizardLM-2-4x7B-MoE-exl2-5_0bpw • Text Generation • Updated • 6
OpenDFM/SciDFM-MoE-A5.6B-v1.0 • Text Generation • Updated • 28 • 1
mradermacher/SOLARC-MOE-10.7Bx6-GGUF
mradermacher/SOLARC-MOE-10.7Bx6-i1-GGUF
mradermacher/Bioxtral-4x7B-v0.1-GGUF
mradermacher/Bioxtral-4x7B-v0.1-i1-GGUF • Updated • 105
mradermacher/ELYZA-japanese-Llama-2-MoE-2x13B-v0.1-GGUF • Updated • 22
mradermacher/Swallow-MoE-2x13B-v0.1-GGUF • Updated • 93
mradermacher/Swallow-MoE-2x13B-v0.1-i1-GGUF • Updated • 73
mradermacher/ELYZA-japanese-Llama-2-MoE-2x7B-v0.1-GGUF • Updated • 99
mradermacher/ELYZA-japanese-Llama-2-fast-MoE-2x7B-v0.1-GGUF • Updated • 53
llama-moe/LLaMA-MoE-v2-3_8B-2_8-sft • Updated • 36 • 3
llama-moe/LLaMA-MoE-v2-3_8B-residual-sft • Updated • 53 • 2