---
library_name: transformers
tags:
  - code
  - hpc
  - parallel
  - axonn
---

# HPC-Coder-v2

The HPC-Coder-v2-6.7b model is a code LLM for high-performance computing (HPC), fine-tuned on an instruction dataset covering common HPC topics such as parallelism, optimization, and accelerator porting. It is a fine-tune of the Deepseek Coder 6.7b model, trained on the hpc-synthetic, oss-instruct, and evol-instruct datasets. We used the distributed training library AxoNN to fine-tune in parallel across many GPUs.
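
Below is a minimal usage sketch with the `transformers` library. The Hub repository ID, the example prompt, and the generation settings are assumptions for illustration, not part of this card; substitute the actual repository path if it differs.

```python
# Minimal sketch: generating HPC code with HPC-Coder-v2-6.7b via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hpcgroup/hpc-coder-v2-6.7b"  # assumed Hub repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 6.7b model fits on one GPU
    device_map="auto",
)

prompt = "Write an OpenMP parallel for loop that computes a vector sum in C++."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.2,  # low temperature for more deterministic code output
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```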

HPC-Coder-v2-6.7b is the best-performing LLM under 30B parameters on the ParEval parallel code generation benchmark in terms of both correctness and performance. It scores similarly to 34B and commercial models such as Phind-V2 and GPT-4 on parallel code generation.