OpenMOSE committed on
Commit 147f740 · verified · 1 Parent(s): 5bfffec

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -12,7 +12,7 @@ license: apache-2.0
 
 ### Model Description
 
-HRWKV7-Reka-3.1 Flash is an RNN hybrid architecture model that combines RWKV v7's linear attention mechanism with Group Query Attention (GQA) layers. Built upon the Reka-flash3.1 21B foundation, this model replaces most Transformer attention blocks with RWKV blocks while strategically maintaining some GQA layers to enhance performance on specific tasks.
+RWKV-Reka-3.1 Flash is an RNN hybrid architecture model that combines RWKV v7's linear attention mechanism with Group Query Attention (GQA) layers. Built upon the Reka-flash3.1 21B foundation, this model replaces most Transformer attention blocks with RWKV blocks while strategically maintaining some GQA layers to enhance performance on specific tasks.
 
 - **Developed by:** OpenMOSE
 - **Model type:** Hybrid Linear-Attention Language Model
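
The model description above outlines the hybrid layout only in prose. The sketch below illustrates the general idea in PyTorch: a decoder stack where most depths use a fixed-state linear-attention recurrence (a toy stand-in for real RWKV-v7 blocks) and a few retained depths use grouped-query attention. This is a minimal sketch, not OpenMOSE's implementation; the block internals, layer count, and which depths keep GQA are all illustrative assumptions.

```python
# Minimal sketch of an RWKV/GQA hybrid stack; NOT OpenMOSE's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GQABlock(nn.Module):
    """Grouped-query attention: many query heads share a small set of KV heads."""

    def __init__(self, dim: int, n_heads: int, n_kv_heads: int):
        super().__init__()
        self.n_heads, self.n_kv_heads = n_heads, n_kv_heads
        self.head_dim = dim // n_heads
        self.q = nn.Linear(dim, dim, bias=False)
        self.kv = nn.Linear(dim, 2 * n_kv_heads * self.head_dim, bias=False)
        self.o = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q = self.q(x).view(b, t, self.n_heads, self.head_dim).transpose(1, 2)
        k, v = self.kv(x).chunk(2, dim=-1)
        k = k.view(b, t, self.n_kv_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, t, self.n_kv_heads, self.head_dim).transpose(1, 2)
        rep = self.n_heads // self.n_kv_heads          # queries per KV head
        k = k.repeat_interleave(rep, dim=1)
        v = v.repeat_interleave(rep, dim=1)
        y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.o(y.transpose(1, 2).reshape(b, t, d))


class RWKVLikeBlock(nn.Module):
    """Toy linear-attention recurrence standing in for an RWKV-v7 block:
    a fixed-size state per channel instead of a KV cache that grows with t."""

    def __init__(self, dim: int):
        super().__init__()
        self.r = nn.Linear(dim, dim, bias=False)       # receptance (output gate)
        self.k = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, dim, bias=False)
        self.decay = nn.Parameter(torch.zeros(dim))    # learned per-channel decay

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        r = torch.sigmoid(self.r(x))
        k, v = self.k(x), self.v(x)
        w = torch.sigmoid(self.decay)                  # decay rate in (0, 1)
        state = x.new_zeros(b, d)
        outs = []
        for i in range(t):                             # O(1) state per token
            state = w * state + k[:, i] * v[:, i]
            outs.append(r[:, i] * state)
        return torch.stack(outs, dim=1)


class HybridStack(nn.Module):
    """Mostly RWKV-style blocks, with GQA kept at a few chosen depths
    (gqa_layers is an illustrative assumption, not the real placement)."""

    def __init__(self, dim: int = 512, n_layers: int = 12, gqa_layers=(5, 11)):
        super().__init__()
        self.layers = nn.ModuleList([
            GQABlock(dim, n_heads=8, n_kv_heads=2) if i in gqa_layers
            else RWKVLikeBlock(dim)
            for i in range(n_layers)
        ])
        self.norms = nn.ModuleList([nn.LayerNorm(dim) for _ in range(n_layers)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for norm, layer in zip(self.norms, self.layers):
            x = x + layer(norm(x))                     # pre-norm residual
        return x


x = torch.randn(2, 16, 512)
print(HybridStack()(x).shape)   # torch.Size([2, 16, 512])
```

The usual motivation for this split, consistent with the description's "enhance performance on specific tasks", is that the RWKV-style blocks carry a constant-size recurrent state (no KV cache growth with context length), while the few remaining GQA layers retain exact softmax attention for behaviors such as long-range recall that pure recurrences can find harder.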