Update README.md
README.md
@@ -69,7 +69,7 @@ DeepCoder generalizes better to long contexts than the base distilled model, due
 | **DeepCoder-14B-Preview** | 45.6 | 57.9 | 60.6 |
 | **DeepSeek-R1-Distill-Qwen-14B** | 50.2 | 53.0 | 53.0 |
 
-A more detailed description of the training recipe can be found in our [blog post](https://
+A more detailed description of the training recipe can be found in our [blog post](https://pretty-radio-b75.notion.site/DeepCoder-A-Fully-Open-Source-14B-Coder-at-O3-mini-Level-1cf81902c14680b3bee5eb349a512a51).
 
 ## Evaluation
 
@@ -105,7 +105,7 @@ This permissive license ensures that researchers, developers, and enthusiasts wo
 @misc{deepcoder2025,
   title={DeepCoder: A Fully Open-Source 14B Coder at O3-mini Level},
   author={Michael Luo, Sijun Tan, Roy Huang, Ameen Patel, Alpay Ariyak, Qingyang Wu, Xiaoxiang Shi, Rachel Xin, Colin Cai, Maurice Weber, Ce Zhang, Li Erran Li, Raluca Ada Popa, Ion Stoica, Tianjun Zhang},
-  howpublished={\url{https://pretty-radio-b75.notion.site/DeepCoder-A-Fully-Open-Source-14B-Coder-at-O3-mini-Level-
+  howpublished={\url{https://pretty-radio-b75.notion.site/DeepCoder-A-Fully-Open-Source-14B-Coder-at-O3-mini-Level-1cf81902c14680b3bee5eb349a512a51}},
   note={Notion Blog},
   year={2025}
 }