---
base_model:
- Qwen/Qwen2.5-32B
- deepseek-ai/DeepSeek-R1-Distill-Qwen-32B
library_name: transformers
tags:
- mergekit
- peft
---
|
# DeepSeek-R1-Distill-Qwen-32B 64R LoRA |
|
|
|
This is a LoRA adapter extracted from a language model using [mergekit](https://github.com/arcee-ai/mergekit).
|
|
|
## LoRA Details |
|
|
|
This LoRA adapter was extracted from [deepseek-ai/DeepSeek-R1-Distill-Qwen-32B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B) and uses [Qwen/Qwen2.5-32B](https://huggingface.co/Qwen/Qwen2.5-32B) as a base. |
|
|
|
### Parameters |
|
|
|
The following command was used to extract this LoRA adapter: |
|
|
|
```sh
mergekit-extract-lora deepseek-ai/DeepSeek-R1-Distill-Qwen-32B Qwen/Qwen2.5-32B OUTPUT_PATH --no-lazy-unpickle --rank=64 --device=cpu
```
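
### Usage

As a minimal sketch, the extracted adapter can be applied on top of the base model with `transformers` and `peft`. The adapter path below is a placeholder for wherever this adapter is stored (a local directory or a Hub repository id):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model the LoRA was extracted against.
base_model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-32B",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-32B")

# Apply the rank-64 LoRA adapter on top of the base weights.
# "ADAPTER_PATH" is a placeholder; point it at this adapter.
model = PeftModel.from_pretrained(base_model, "ADAPTER_PATH")

# Optionally fold the adapter into the base weights for inference.
model = model.merge_and_unload()
```

With the adapter merged, the model should behave approximately like the original [deepseek-ai/DeepSeek-R1-Distill-Qwen-32B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B), up to the rank-64 approximation error of the extraction.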
|
|