---
base_model:
- appvoid/palmer-002-32k
- raidhon/coven_tiny_1.1b_32k_orpo_alpha
- appvoid/palmer-003
library_name: transformers
tags:
- mergekit
- merge
---

# palmer

palmer-004 is a merge of models aiming to carry the performance of palmer-003 all the way to a 32k context window. It was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method with [palmer-002-32k](https://huggingface.co/appvoid/palmer-002-32k) as the base.

palmer-004 performs better than coven_tiny_1.1b_32k_orpo_alpha, the current state of the art on the open-llm-leaderboard, making it the best overall 1b model on Hugging Face as of 06/01/2024.

The following models were included in the merge:

* [coven_tiny_1.1b_32k_orpo_alpha](https://huggingface.co/raidhon/coven_tiny_1.1b_32k_orpo_alpha)
* [palmer-003](https://huggingface.co/appvoid/palmer-003)
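
A merge like this can be described with a mergekit YAML config. The sketch below uses the models and base model named above; the `density` and `weight` values (and `dtype`) are illustrative assumptions, not the parameters actually used for palmer-004:

```yaml
# Hypothetical mergekit config for a TIES merge of this kind.
# Models and base_model match the card; the numeric parameters
# are assumed placeholders, not the real merge settings.
models:
  - model: raidhon/coven_tiny_1.1b_32k_orpo_alpha
    parameters:
      density: 0.5   # assumed: fraction of delta weights kept
      weight: 0.5    # assumed: relative contribution to the merge
  - model: appvoid/palmer-003
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: appvoid/palmer-002-32k
parameters:
  normalize: true
dtype: float16
```

With mergekit installed, a config like this is run with `mergekit-yaml config.yaml ./output-model`.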