Merged LLaMA Model

This is a merged version of the LLaMA2-13B model in which transformer layers are clustered via hyperboloid projections and similar layers are merged. The merged model retains 31 of the base model's 40 layers (10.2B parameters) while preserving most of its performance across benchmarks.
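The card does not document the merging procedure itself, so the following is only a minimal sketch of how hyperboloid projections could drive layer clustering: layer feature vectors are lifted onto the hyperboloid model of hyperbolic space, pairwise geodesic distances are computed with the Lorentzian inner product, and consecutive layers closer than a threshold are grouped for merging. The lifting step, distance function, greedy grouping, and all names below are assumptions for illustration, not the released model's actual pipeline.

```python
import numpy as np

def to_hyperboloid(x: np.ndarray) -> np.ndarray:
    """Lift a Euclidean feature vector onto the hyperboloid model H^n.

    The time-like coordinate x0 = sqrt(1 + ||x||^2) guarantees the
    Lorentzian self-inner-product <p, p>_L = -1.  (Assumed lifting step.)
    """
    x0 = np.sqrt(1.0 + np.dot(x, x))
    return np.concatenate(([x0], x))

def hyperbolic_distance(p: np.ndarray, q: np.ndarray) -> float:
    """Geodesic distance on the hyperboloid: arccosh(-<p, q>_L)."""
    lorentz_inner = -p[0] * q[0] + np.dot(p[1:], q[1:])
    # Clip guards against arccosh of values slightly below 1 due to
    # floating-point error.
    return float(np.arccosh(np.clip(-lorentz_inner, 1.0, None)))

def cluster_layers(layer_feats: np.ndarray, threshold: float) -> list[list[int]]:
    """Greedily group consecutive layers that are close on the hyperboloid.

    Each cluster would then be merged into a single layer (e.g. by
    averaging weights), which is how a 40-layer model could shrink to 31.
    This greedy scheme is a hypothetical stand-in for the real method.
    """
    points = [to_hyperboloid(f) for f in layer_feats]
    clusters = [[0]]
    for i in range(1, len(points)):
        if hyperbolic_distance(points[i - 1], points[i]) < threshold:
            clusters[-1].append(i)
        else:
            clusters.append([i])
    return clusters
```

Hyperbolic distances are often preferred for hierarchical structure because they grow quickly as points separate, so a single threshold can separate genuinely dissimilar layers while grouping near-duplicates.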

Model size: 10.2B params
Tensor type: FP16
Format: Safetensors

Model: namannn/llama2-13b-hyperbolic-cluster-pruned
