---
license: apache-2.0
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: text-generation
tags:
- lora
- adapter
- writing
- CoT
---
## Model Details
- Base Model: meta-llama/Llama-3.1-8B-Instruct
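
A minimal usage sketch with `transformers` and `peft`. The adapter repository id below is a placeholder for this repo's actual id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.1-8B-Instruct"
adapter_id = "your-org/your-merged-adapter"  # placeholder, not an actual repo name

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
# Attach the merged LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "Write a short reflective paragraph about tide pools."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```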
## Merge Configuration
### Source Adapters
All source adapters share the following configuration (mirrored in the sketch after this list):
- Rank (r): 16
- Alpha: 16
- Target Modules:
  - q_proj (Query projection)
  - k_proj (Key projection)
  - v_proj (Value projection)
  - o_proj (Output projection)
  - up_proj (MLP up projection)
  - down_proj (MLP down projection)
  - gate_proj (MLP gate projection)
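
The shared configuration, expressed as a PEFT `LoraConfig` sketch. Only the rank, alpha, and target modules are documented above; `task_type` is an assumption:

```python
from peft import LoraConfig

# Shared source-adapter configuration as described above.
# task_type is assumed; r, alpha, and target modules are documented.
shared_config = LoraConfig(
    r=16,
    lora_alpha=16,
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "up_proj", "down_proj", "gate_proj",
    ],
    task_type="CAUSAL_LM",
)
```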
### Merge Notes
- The order in which adapters are loaded may affect the final result.
- Equal weights were chosen to maintain a balanced influence from each adapter (see the sketch below).
- The merged adapter retains the same architecture and rank as the source adapters.
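
A hedged reconstruction of how such an equal-weight merge could be performed with PEFT's `add_weighted_adapter`. The adapter repo ids, adapter names, and the two-adapter count are placeholders, and `combination_type="linear"` is one reasonable choice given that all sources share the same rank:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder repo ids and adapter names, for illustration only.
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base, "org/writing-adapter", adapter_name="writing")
model.load_adapter("org/cot-adapter", adapter_name="cot")

# Linear combination with equal weights; "linear" requires all source
# adapters to share the same rank, which holds here (r=16).
model.add_weighted_adapter(
    adapters=["writing", "cot"],
    weights=[0.5, 0.5],
    adapter_name="merged",
    combination_type="linear",
)
model.set_adapter("merged")
```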
While this adapter merges multiple fine-tunes, each component was developed through independent research efforts to explore language model capabilities as part of an R&D process.