---
library_name: transformers
tags:
- code
- hpc
- parallel
- axonn
---

# HPC-Coder-v2

The HPC-Coder-v2-6.7b model is an HPC code LLM fine-tuned on an instruction dataset covering common HPC topics such as parallelism, optimization, and accelerator porting.
This version is fine-tuned from the [Deepseek Coder 6.7b](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base) base model.
It was fine-tuned on the [hpc-synthetic](https://huggingface.co/datasets/hpcgroup/hpc-synthetic), [oss-instruct](https://huggingface.co/datasets/ise-uiuc/Magicoder-OSS-Instruct-75K), and [evol-instruct](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1) datasets.

HPC-Coder-v2-6.7b is the best-performing LLM under 30B parameters on the [ParEval](https://github.com/parallelcodefoundry/ParEval) parallel code generation benchmark in terms of _correctness_ and _performance_.
It scores comparably to 34B and commercial models such as Phind-V2 and GPT-4 on parallel code generation.
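The model can be loaded with the standard `transformers` API. Below is a minimal usage sketch; the repository id `hpcgroup/hpc-coder-v2-6.7b` and the plain-text prompt format are assumptions, so substitute the actual model id and apply any instruction template used during fine-tuning if one is provided.

```python
# Minimal usage sketch with Hugging Face transformers.
# NOTE: the repository id below is an assumption; replace it with this card's model id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hpcgroup/hpc-coder-v2-6.7b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on a single GPU
    device_map="auto",
)

# Example HPC-flavored instruction (prompt format is an assumption).
prompt = "Write an OpenMP function that computes the dot product of two float arrays in parallel."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```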