Join our Discord! https://discord.gg/BeaverAI

More than 7000 members strong πŸ’ͺ A hub for users and makers alike!


Drummer is open for work / employment (I'm a Software Engineer). Contact me through any of these channels: https://linktr.ee/thelocaldrummer

Thank you to everyone who subscribed through Patreon. Your support helps me chug along in this brave new world.


Drummer proudly presents...

Behemoth X 123B v2 🦣

Usage

  • Mistral v7 (Non-Tekken) | (i.e., Mistral v3 + [SYSTEM_PROMPT]); see the sketch below
  • Non-reasoning model
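
If you need to assemble the prompt by hand, here is a minimal sketch of that layout, assuming the usual Mistral v3-style [INST]...[/INST] turns with a [SYSTEM_PROMPT] block up front. The build_prompt helper is illustrative only; exact BOS and spacing handling can differ per backend, so prefer your frontend's built-in Mistral V7 preset if it has one.

```python
# Illustrative sketch only: Mistral v7 (Non-Tekken) = v3-style [INST] turns
# plus a [SYSTEM_PROMPT]...[/SYSTEM_PROMPT] block at the start.
# Verify token spacing / BOS handling against the model's tokenizer config.

def build_prompt(system: str, turns: list[tuple[str, str]]) -> str:
    """turns = [(user, assistant), ...]; leave assistant empty for the
    turn you want the model to generate."""
    prompt = f"<s>[SYSTEM_PROMPT] {system}[/SYSTEM_PROMPT]"
    for user, assistant in turns:
        prompt += f"[INST] {user}[/INST]"
        if assistant:
            prompt += f" {assistant}</s>"
    return prompt

print(build_prompt(
    "You are a co-narrator for a long-form roleplay.",
    [("Describe the tavern as the party walks in.", "")],
))
```

Since it's a non-reasoning model, there are no thinking tags to handle; just send the assembled prompt as-is.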

Description

Seems to pass a secrecy test in a few gens. It successfully tracked which of six characters share a secret and which don't. Really liking the prose. Logic is mostly decent.

So far it's 5 out of 5. Made me cry. Would let it stab me in the feels again.

This is one of the moments where I really enjoyed reading the generation.

Recall is fantastic in v2b. I had a response just now that pulled in like 20 minor details. It was nuts. I'm at ~100 gens into an RP now and v2b has been perfect throughout so far. Maybe you hit on some magic like Midnight Miqu, I dunno.

Language choice is better than the OG too. That's what Monstral gave that I liked so much. My card is an evil character and she is keeping it hidden so, so well. Laying the trap slowly, gaining trust. It's just amazing to watch. If this keeps up, this might be your best model ever, imo.

I mostly do mixed co-narrator/MC RP content on my phone. I'm so deep into one storyline that it takes a minute of furious scrolling to reach back to the top, and it's still coherent. No templates, just a lightweight sys prompt. Great model; I'd hate to go back to 24B or even 70B from here. It recognized the video game Katawa Shoujo and incorporated character flaws seamlessly into my story.

Links

config-v2b
