Crazy Reasoning Qwen2.5 7B 🧠

A funny reasoning model, trained on a manually curated, high-quality dataset! Don't use it for real problems.

It produces "Secret Thoughts", after which it gives an "Answer".

Officially supports English, Russian, Chinese, French, and Spanish (these languages were in the training dataset), but you can try others!

  • Context Length: 32,768 tokens.
  • Dataset: Custom, with no synthetic data.
  • Prompt Example: "Hello!".
  • System Prompt: You can set your language and other instructions here for better performance (see the usage sketch below).
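
Since the repository ships GGUF quantizations, a minimal chat sketch with llama-cpp-python is shown below. llama-cpp-python is just one possible runtime, and the GGUF filename pattern is an assumption, so check the repository's file list for the exact names.

```python
# Minimal sketch, assuming llama-cpp-python and huggingface-hub are installed:
#   pip install llama-cpp-python huggingface-hub
from llama_cpp import Llama

# NOTE: the filename pattern is an assumption -- check the repo for the exact GGUF file names.
llm = Llama.from_pretrained(
    repo_id="FGOTYT/Crazy_Reasoning_Qwen2.5_7B",
    filename="*Q4_K_M.gguf",  # assumed 4-bit quant
    n_ctx=32768,              # the model's advertised context length
)

response = llm.create_chat_completion(
    messages=[
        # System prompt: set your language and any other instructions here.
        {"role": "system", "content": "Answer in English."},
        # Prompt example from the card.
        {"role": "user", "content": "Hello!"},
    ],
    max_tokens=512,
)

# The reply should contain the model's "Secret Thoughts" followed by its "Answer".
print(response["choices"][0]["message"]["content"])
```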

This model is best used for entertainment purposes (it would be funny if it actually turned out to work better at other things too). Author: FGOTYT (me).

  • Base Model: Qwen/Qwen2.5-7B.
  • Format: GGUF, available in 4-bit, 5-bit, 6-bit, and 8-bit quantizations (see the download sketch below).
  • Model Size: 7.62B parameters.
  • Architecture: qwen2.
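
If you only want to grab one of the quantized files, here is a minimal download sketch with huggingface_hub; the exact filename is an assumption and should be taken from the repository's file list.

```python
# Minimal download sketch, assuming huggingface-hub is installed (pip install huggingface-hub).
from huggingface_hub import hf_hub_download

# NOTE: the filename is an assumption -- check the repository's file list for the real name.
local_path = hf_hub_download(
    repo_id="FGOTYT/Crazy_Reasoning_Qwen2.5_7B",
    filename="Crazy_Reasoning_Qwen2.5_7B-Q4_K_M.gguf",
)
print(local_path)  # local path to the downloaded GGUF file
```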

