---
license: apache-2.0
language:
  - en
base_model:
  - meta-llama/Llama-2-7b-hf
---

# Model Card for 2:4 Sparse LLaMA2-7B

This repo contains a 2:4 sparse version of the LLaMA2-7B model, trained with the methods from the AAAI 2025 paper *Pruning Large Language Models with Semi-Structural Adaptive Sparse Training*.

## Model Description

Same architecture as LLaMA2-7B, but the weights of every linear layer conform to the 2:4 sparse pattern: in each contiguous group of four weights, at most two are non-zero.
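As a minimal sketch (assuming NumPy; the helper below is illustrative, not part of this repo), this is how one might verify that a weight matrix satisfies the 2:4 constraint:

```python
import numpy as np

def is_24_sparse(w: np.ndarray) -> bool:
    """Return True if every contiguous group of 4 values along the last
    axis contains at most 2 non-zeros (the 2:4 structured-sparse pattern)."""
    if w.shape[-1] % 4 != 0:
        raise ValueError("last dimension must be a multiple of 4")
    groups = w.reshape(-1, 4)                       # one row per group of 4
    return bool((np.count_nonzero(groups, axis=1) <= 2).all())

# Toy example: two groups of 4, each with exactly 2 non-zeros.
w = np.array([[0.5, 0.0, -0.3, 0.0,
               0.0, 1.2,  0.0, -0.7]])
print(is_24_sparse(w))          # True
print(is_24_sparse(np.ones((1, 8))))  # False: 4 non-zeros per group
```

The same check can be applied to each linear layer's weight tensor after loading the model, e.g. with `layer.weight.detach().cpu().numpy()` in PyTorch.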