Update README.md
README.md CHANGED
@@ -114,7 +114,7 @@ This is the third-generation model of the **ZYH-LLM series**.
 
 It employs a large amount of model merging techniques, aiming to provide a **powerful and unified 14-billion-parameter model**, laying a solid foundation for further model merging and model fine-tuning.
 
-
+## As of February 25, 2025, the 14B model with the highest IFEval score
 
 
 # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
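For context, the model card above describes a standard 14B causal language model. Below is a minimal usage sketch, assuming the merged checkpoint is published on the Hugging Face Hub; the repo id is hypothetical and not taken from this commit.

```python
# Minimal sketch: loading a merged 14B checkpoint with transformers.
# The repo id below is a placeholder -- substitute the actual ZYH-LLM 14B repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/ZYH-LLM-14B"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # requires `accelerate` for multi-GPU / CPU offload
)

prompt = "Briefly explain what model merging is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```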