---
license: apache-2.0
language:
- en
base_model:
- meta-llama/Llama-2-7b-hf
---
# 2:4 Sparse LLaMA2-7B

This repository contains a 2:4 semi-structured sparse version of LLaMA2-7B, trained with the method from the AAAI 2025 paper [Pruning Large Language Models with Semi-Structural Adaptive Sparse Training](https://arxiv.org/abs/2407.20584).
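
The checkpoint can be loaded like any other LLaMA2-style model with `transformers`. A minimal usage sketch; the repo id below is a hypothetical placeholder, so substitute this repository's actual id on the Hub:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "username/llama2-7b-sparse-24"  # hypothetical placeholder
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```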

### Model Description

The model has the same architecture as LLaMA2-7B, but the weights of every linear layer conform to the 2:4 sparsity pattern: each contiguous group of four values contains at most two nonzeros.
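
The pattern can be verified directly from the weights. A minimal sketch, assuming the model is loaded as above and that groups run along the row (input) dimension, matching the usual 2:4 convention:

```python
import torch

def is_24_sparse(weight: torch.Tensor) -> bool:
    """True if every contiguous group of 4 elements along each row
    has at most 2 nonzeros (the 2:4 semi-structured pattern)."""
    groups = weight.reshape(-1, 4)  # rows are contiguous, and LLaMA dims divide by 4
    nonzeros_per_group = (groups != 0).sum(dim=1)
    return bool((nonzeros_per_group <= 2).all())

# Example: check all linear layers of the loaded model.
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Linear):
        print(name, is_24_sparse(module.weight.data))
```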