---
base_model:
- appvoid/palmer-002-32k
- raidhon/coven_tiny_1.1b_32k_orpo_alpha
- appvoid/palmer-003
library_name: transformers
tags:
- mergekit
- merge

---
# palmer

palmer-004 is a merge of models aiming to carry the performance of palmer-003 all the way to a 32k context window. It was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method with [palmer-002-32k](https://huggingface.co/appvoid/palmer-002-32k) as the base.

palmer-004 performs better than coven_tiny_1.1b_32k_orpo_alpha, the current state of the art on the Open LLM Leaderboard, making it the best overall 1b model on Hugging Face as of 06/01/2024.

The following models were included in the merge:
* [coven_tiny_1.1b_32k_orpo_alpha](https://huggingface.co/raidhon/coven_tiny_1.1b_32k_orpo_alpha)
* [palmer-003](https://huggingface.co/appvoid/palmer-003)
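
A TIES merge like this one is driven by a mergekit YAML config. The sketch below reconstructs a plausible config from the models listed above; the `density` and `weight` values are illustrative assumptions, not the actual parameters used for palmer-004:

```yaml
# Hypothetical mergekit config for a TIES merge of the listed models.
# density/weight values are placeholders, not the ones used for palmer-004.
models:
  - model: raidhon/coven_tiny_1.1b_32k_orpo_alpha
    parameters:
      density: 0.5
      weight: 0.5
  - model: appvoid/palmer-003
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: appvoid/palmer-002-32k
dtype: float16
```

With mergekit installed, a config like this is run with `mergekit-yaml config.yml ./output-model`.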