Aggregat_18.5B

This is a merge of pre-trained language models

The goal of this merge was to explore the limits of 12B merging.

This merge is surprisingly stable; its logical capabilities are decent, and the already good prose of Badman and Unity is enriched by some layers of Delta-Vector/Rei-12B.

It still needs more testing.

Use the ChatML prompt format.

Tested at temperature 1.01.
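The card recommends the ChatML prompt format. As a minimal sketch (the delimiter tokens are the standard ChatML ones; the helper function and example messages are illustrative, not part of this repo), a prompt can be assembled like this:

```python
def chatml_prompt(messages):
    """Build a ChatML-formatted prompt from a list of {role, content} dicts."""
    # Each turn is wrapped as: <|im_start|>role\ncontent<|im_end|>
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

If you load the model with `transformers`, the tokenizer's built-in chat template (via `tokenizer.apply_chat_template`) should produce the same layout, so the manual helper is only needed for backends without template support.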

Model size: 18.5B params
Tensor type: BF16
Format: Safetensors