---
license: apache-2.0
base_model:
- BAAI/Emu3-Gen
library_name: transformers
tags:
- merge
---
This is an interpolated upscale of [BAAI/Emu3-Gen](https://huggingface.co/BAAI/Emu3-Gen) from 8B to 11.5B parameters.
For each layer index in [7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 22, 23, 24], a new layer was created by linearly interpolating (lerping) the weights of the previous layer and the current layer, and the result was inserted between the two.
The expansion script is [here](https://huggingface.co/lodrick-the-lafted/Emu3-Gen-12B/blob/main/emu3_expand.py).
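
For reference, below is a minimal sketch of the lerp-and-insert idea, not the actual script linked above. It assumes a Llama-style decoder that exposes its blocks at `model.model.layers` and a `num_hidden_layers` config field; the real attribute paths and config fields for Emu3 may differ.

```python
import copy

import torch


def lerp_state_dict(prev_sd, curr_sd, t=0.5):
    """Element-wise linear interpolation between two layer state dicts."""
    return {
        k: torch.lerp(prev_sd[k].float(), curr_sd[k].float(), t).to(curr_sd[k].dtype)
        for k in curr_sd
    }


def expand_model(model, insert_before=(7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 22, 23, 24), t=0.5):
    """Before each listed layer i, insert a copy whose weights are lerp(layer i-1, layer i)."""
    layers = model.model.layers  # assumed Llama-style attribute path
    new_layers = []
    for i, layer in enumerate(layers):
        if i in insert_before:
            interp = copy.deepcopy(layer)
            interp.load_state_dict(
                lerp_state_dict(layers[i - 1].state_dict(), layer.state_dict(), t)
            )
            new_layers.append(interp)
        new_layers.append(layer)
    model.model.layers = torch.nn.ModuleList(new_layers)
    model.config.num_hidden_layers = len(new_layers)
    return model


# Hypothetical usage (Emu3 ships custom modeling code, so trust_remote_code
# may be required):
#
#   from transformers import AutoModelForCausalLM
#   model = AutoModelForCausalLM.from_pretrained("BAAI/Emu3-Gen", trust_remote_code=True)
#   model = expand_model(model)
#   model.save_pretrained("Emu3-Gen-expanded")
```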