This is a merge of pre-trained language models, designed as a Russian-capable 7B model.
It is good and fast for RP, ERP, and chat. It sometimes hallucinates, and sometimes writes excellent output on the first try.
This version is more stable than v3.
Of course, it is better to try at least a 12B model with offloading; it may be slower, but it is far "smarter" than any 7B or 8B.
Tested with the ChatML prompt format at temperature 1.01.
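
For reference, here is a minimal usage sketch with llama-cpp-python, assuming a GGUF quantization of this model (the file name below is hypothetical). It shows the three things the card mentions: partial GPU offloading for weak hardware, the ChatML chat format, and temperature 1.01.

```python
# Minimal sketch, not the card's official setup. Assumes a GGUF quant of
# this model and the llama-cpp-python package; the file name and layer
# count below are hypothetical — adjust them to your hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="model-7b.Q4_K_M.gguf",  # hypothetical GGUF file name
    chat_format="chatml",               # the card reports testing with ChatML
    n_gpu_layers=20,                    # offload part of the model to GPU
    n_ctx=4096,                         # context window
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Привет! Расскажи короткую историю."},
    ],
    temperature=1.01,  # temperature the card reports testing at
)
print(out["choices"][0]["message"]["content"])
```

With `n_gpu_layers` you can trade speed for VRAM: set it to 0 for CPU-only inference, or raise it until the model no longer fits on your GPU.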